
IBM Software Group
Rational Developer for System z and Rational Development and Test Environment for System z
Managing Test Data
Jon Sayles, IBM/Rational
© 2011 IBM Corporation
IBM Trademarks and Copyrights
§ © Copyright IBM Corporation 2007, 2008, 2009, 2010, 2011, 2012. All rights reserved.
§ The information contained in these materials is provided for informational purposes only, and is provided AS IS without warranty of any kind, express or implied. IBM shall not be responsible for any damages arising out of the use of, or otherwise related to, these materials. Nothing contained in these materials is intended to, nor shall have the effect of, creating any warranties or representations from IBM or its suppliers or licensors, or altering the terms and conditions of the applicable license agreement governing the use of IBM software. References in these materials to IBM products, programs, or services do not imply that they will be available in all countries in which IBM operates.
§ This information is based on current IBM product plans and strategy, which are subject to change by IBM without notice. Product release dates and/or capabilities referenced in these materials may change at any time at IBM's sole discretion based on market opportunities or other factors, and are not intended to be a commitment to future product or feature availability in any way.
§ IBM, the IBM logo, the on-demand business logo, Rational, the Rational logo, and other IBM Rational products and services are trademarks or registered trademarks of the International Business Machines Corporation, in the United States, other countries or both. Other company, product, or service names may be trademarks or service marks of others.
Course Contributing Authors
§ Thanks to the following individuals for assisting with this course:
- David Myers, IBM/Rational
- John T. Gates, IBM/Rational
- Doug Nadel, IBM/Rational
- David Bean, IBM/Rational
- Chris Rayns, IBM/Rational
Course Details
§ Audience
- This course is for z/OS application or systems software developers who are tasked with provisioning (migrating) and maintaining test data in a Rational Developer for System z Unit Test (RD&T) environment.
§ Prerequisites
- This course assumes that the participant has a deep understanding of z/OS application development technology, including:
  - Files and databases
  - JCL
  - TSO/E CLISTs and REXX Execs
  - Relational technology and DB2 (if you are migrating DB2 data), including SQL and running DB2 utilities
  - IMS (DL/I) technology – if you are migrating IMS databases
- Experience using Rational Developer for System z – z/OS and Data perspectives
- An understanding of RD&T
- RDz must be installed and configured on your host (z/OS) system
- Your RD&T product should be installed, operational, and connected to the RDz client – and to a z/OS mainframe LPAR if you wish to do the hands-on workshops
- It is highly recommended that you use Rational Asset Analyzer (RAA) to help identify unit test resources and the relationships among them
Course Objectives
§ At the end of this course, you will be able to:
- Describe the characteristics of your test data environment
- Migrate DB2 objects:
  - Definitions
  - Test table rows
  - Subset and migrate subsets of DB2 test table data rows through SQL
- Migrate QSAM data
- Migrate VSAM data
- Migrate IMS (DL/I) databases
UNIT: Managing Test Data
Topics:
§ Test Data Management – Overview, Terms and Concepts
§ DB2 Test Table and Data Migration
§ QSAM/VSAM Data Migration
§ IMS DL/I Database Migration
§ Migrating CICS Transaction Metadata
§ Migrating IMS Transaction Metadata
§ Back-up Slides
Topic Objectives
After completing this topic, you should be able to describe:
- What a valid unit test data environment is
- What data you provision
- The concept of a logical record
- Volume migration
- RAA's role
- "Cloning production data" versus a "managed data" environment
- Protecting (masking) data values
- Non-IBM databases:
  - IDMS and IDMS/R – http://idms-training.com/
  - Datacom
- Other (non-standard) IBM databases:
  - TableBase
- Refreshing data
- Data creation – using utilities and 3rd-party products
- Accessing data on z/OS – through DRDA and CICS transaction routing
What is a valid test data environment – for unit test?
§ Unit Test is defined by IEEE as "testing of individual hardware or software units or groups of related units"
§ The data you need to migrate depends on your application unit testing process requirements – that is, on the units under test:
- Modules – your programs in COBOL, PL/I and Assembler
- Driver modules
- Stub modules
§ We present this formal definition to distinguish Unit Test from other testing levels in the SDLC (Software Development Lifecycle):
- Integration test
- Systems/Regression test
- QA (Acceptance) test
§ The quality and quantity of test data rises as you move through the lifecycle
Provisioning for Unit Test vs. Integration/Systems/Acceptance Testing
§ We present this definition to distinguish Unit Test from other testing levels in the SDLC (Software Development Lifecycle) that are out of scope:
- Integration test
- Systems/Regression test
- QA (Acceptance) test
§ The quantity and quality of test data rises as you move through the lifecycle:
- Unit Test – the minimal amount of data for the use cases you are working on, which may include a:
  - Single program and called sub-modules
  - Batch job
  - Transaction
- Integration Test – all of the data values necessary to prove a larger-grain unit of work:
  - Multiple batch job streams (a batch cycle)
  - Multiple transactions
- Systems/Regression/QA (Acceptance) test – all of the values plus the data volume necessary to test:
  - Multiple batch cycles related to a business application
  - All transactions related to a business application
  - Enough data volume to provide stress testing metrics
What data do you provision – for RD&T?
§ Two executional units – data is required for:
- Transactions
- Batch jobs
§ Transaction data includes:
- CICS: online VSAM files, DB2 tables, DL/I databases
- IMS: resources accessed in MPP and BMP transactions – DL/I databases, Fast Path DEDBs, GSAM databases, DB2 tables
§ Batch job data includes:
- QSAM and VSAM files
- DL/I databases
- DB2 tables
§ Ultimately, RD&T data provisioning depends on:
- How you define Unit Testing
- The application entities (programs) you've decided to work on
- The use cases you intend to work on using RD&T
How do I know what data is required for Transactions?
§ Two approaches:
1. Interview Subject Matter Experts (SMEs)
2. Discovery through code analysis – which could be done manually, or using RAA

How do I know what data is required for Batch Jobs?
§ Two approaches:
1. Interview Subject Matter Experts (SMEs)
2. Discovery through code analysis – which could be done manually, or using RAA
Types of z/OS Test Data
§ Files and databases:
- Relational (DB2)
- Hierarchical (IMS)
- VSAM/QSAM
- Non-IBM files and databases (IDMS/IDMS-R, Datacom, Adabas, etc.)
- Other IBM files: TableBase, etc.
§ In this course we will focus on tools and processes to migrate:
- DB2
- IMS
- VSAM/QSAM
Test data options for your RD&T environment
§ Two choices:
1. "Cloned" data – typically an image copy of production – utilized in an existing test subsystem, migrated to RD&T
2. Managed data – subsets of data from an existing test subsystem migrated to RD&T
§ Comments:
- Both choices have strengths and weaknesses
- You will need a migration strategy, tools and procedures to move the data efficiently in either case
- After migration you will have the ability to make "trivial edits" to your migrated files and databases
Cloned Data – Benefits
§ Benefits of cloned data:
- The DBAs and Systems Programmers in your shop probably have a set of well-defined procedures for migrating LPARs
  - These procedures can be used to migrate VSAM/QSAM, DB2 and IMS test data to RD&T
  - The same set of technical skills can be utilized – although knowledge of the Unit Test environment's idiosyncrasies and of the Linux operating system is extremely important
- A cloned data environment should cover all conditions relating to the existing production application specification
- No need for data subsetting
- Large data volume could be useful in performance and stress testing
Cloned Data – Drawbacks
§ Drawbacks to cloned data:
- It is hard to pin Unit Test results to cause and effect: finding the specific records/rows/segments/values that drive conditional logic during unit test is inefficient
- Complex and time-consuming to change objects (less of an "agile" approach to testing)
- Good chance that the data is not masked
- Sheer volume may inhibit migration to RD&T
- Will require MVS Systems Programmer-level skills and authorization
Managed Data – Benefits and Drawbacks
§ Benefits of managed data:
- Increased accuracy of testing
- Less expensive – by all of the previous criteria
- Simplifies masking and privacy
§ Drawbacks of managed data:
- Might not cover all conditions relating to the existing production application specification
- Test data subsetting can be complex and time-consuming
- Lack of volume data makes managed subsets less useful for performance and stress testing
Subsetting and Managed Data
Three types of subsetting:
1. File-level subsetting
- The entire file, table or database is migrated to RD&T
2. Physical data subsetting
- A specified # of rows, records or segments is migrated to RD&T (see the REPRO sketch below)
3. Logical data subsetting
- Specific rows, records or segments that comprise a "logical record" are migrated to RD&T
- A logical data subset is sometimes called a "logical record"
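To make physical data subsetting concrete, here is a minimal IDCAMS REPRO sketch that copies only the first 1,000 records of a file to a sequential extract that can then be sent to RD&T. The dataset names, space values, and DCB attributes are invented placeholders, not values from this course.

//SUBSET   JOB (ACCT),'PHYS SUBSET',CLASS=A,MSGCLASS=X
//* Copy only the first 1,000 records of a VSAM KSDS to a
//* sequential dataset that can then be FTP'd to RD&T
//STEP1    EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//INDD     DD DSN=PROD.CUSTOMER.KSDS,DISP=SHR
//OUTDD    DD DSN=TESTID.CUSTOMER.EXTRACT,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(5,5)),
//            DCB=(RECFM=VB,LRECL=404,BLKSIZE=0)
//SYSIN    DD *
  REPRO INFILE(INDD) OUTFILE(OUTDD) COUNT(1000)
/*

REPRO's SKIP and COUNT parameters together let you pull a record window from anywhere in the file, which is usually enough for a physical subset.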
DASD Migration
§ A simple client/server utility provided with RD&T allows you to migrate 3380/3390 DASD volumes from z/OS to RD&T:
- The server portion (on the remote z/OS) reads all the tracks on a selected volume and sends them to the client (on the local base Linux)
- The client transforms the tracks into the emulated 3380/3390 format used by RD&T and writes them as a Linux file
- You then use this file as an emulated volume under RD&T
(Diagram: a z/OS volume – probably not active – copied to a Linux machine with RD&T installed.)
DASD Migration – Considerations – 1 of 2
§ The server portion (on z/OS) requires specific RACF authorizations.
§ DASD Migration can copy active volumes, although the usefulness of the copy might be questionable, depending on the volume activity at the time.
§ The speed of the copies depends on the TCP/IP bandwidth between the client and server, and the contents of each track.
- A considerable amount of data is involved on a typical 3390 volume; the transmission may take some time.
- A restart function is provided to continue a failed transfer.
DASD Migration – Considerations – 2 of 2
§ DASD Migration is a "best practice" for creating a cloned test data environment
§ Migrating VSAM user catalogs is possible.
- You can connect the RD&T master catalog to the DASD-migrated user catalogs
§ You can use DASD volume migration for:
- QSAM files
- IMS databases that are HDAM/OSAM (maybe others – MSDB, DEDB, and Fast Path still need to be looked into)
§ But you should not use DASD Migration for:
- VSAM data – because there exist VSAM metadata and possibly catalog (VVDS) data sets referenced in the migrated volume
- DB2 tables – because their underlying physical data source is VSAM
§ There is an excellent IBM Redbook on the topic of DASD Migration (System z Personal Development Tool Volume 3 Additional Topics – SG24-7723)
- Additional information on running this utility is available therein
Managed Data Migration
§ Physical data subsetting
- A specified # of rows, records or segments is:
  - Extracted from z/OS using utilities and tools
  - FTP'd from a z/OS LPAR to RD&T
  - Loaded into the RD&T environment using utilities and tools
§ There are many different approaches to the above, depending on:
- The kind of data you need to migrate
- The tools you own
- The cooperation and involvement of systems technical staff: DBAs and Systems Programmers
§ A high-level chart depicting the options from 10,000 feet is shown on the next slide
Migration Process Considerations
§ Migrate all objects needed in one unit
- Primary work done by the DBA group
- Application teams use file and database editing tools for specific test data updates
- Extends the current DBA/Application group model of most shops
- Can be considered more efficient (economy of scale)
§ Migrate objects "as needed" by application teams
- Typically done by application teams
- More agile: no development delays due to unavailable personnel
- Developers made more aware of data content – with benefits to coding, design and testing
- Developers will need more training
"Logical Record" Concept § Refers to the notion of a subset of your production data sources, that contains all of the necessary elements and dependencies to provide unit test fidelity § Examples: 4 Insurance: Insured Policies Claim Payments Riders 4 Retail: Store Departments Items Prices (sale/standard price) Salesclerks Transactions § A "logical record" then, would consist of the minimum # of records/rows/segments necessary to provide data values for a valid Unit Test 4"Valid Unit Test" in this case to mean that there are no application errors or exceptions that occur due to invalid data values or relationships that would not exist in a production environment 24
How can I migrate a "Logical Record"?
§ Three approaches:
1. DASD Migration
2. Migrate all records/rows/segments in a collection of related files/tables/databases
3. Migrate subsets of files and databases (both logical and physical subsets)
§ All three approaches will be covered in the remainder of this course
Summary of file types and migration strategies
§ Complete file – QSAM: N/A; VSAM: REPRO; IMS: IMS Utilities, OPTIM; DB2: DB2 Utilities, RDz Data Tools, OPTIM; Other (IDMS, Datacom, Tablebase): ?
§ Physical subset of file – QSAM: REPRO; VSAM: OPTIM, DITTO/ESA; IMS: IMS Utilities, OPTIM; DB2: DB2 Utilities, RDz Data Tools, OPTIM; Other: ?
§ Logical test data subsetting required – QSAM: OPTIM, DITTO/ESA; VSAM: OPTIM, REPRO, DITTO/ESA; IMS: OPTIM; DB2: ?; Other: ?
§ Data masking required – OPTIM; Other: ?
§ Connect to data on z/OS – VSAM: CICS Function Shipping; DB2: DRDA; Other: ?
§ Volume (DASD) migration – X (IMS: not HIDAM databases)
What about DRDA and CICS Transaction Routing?
§ Placeholder slide – pending research
Topic Objectives
Having completed this topic, you should now be able to answer the following questions and define the following terms and concepts:
- What is a valid test data environment for unit test?
- What do you provision?
- The concept of a logical record
- Volume migration
- Rational Asset Analyzer
- Cloning production data
- Protecting (masking) data values
- Non-IBM databases:
  - IDMS and IDMS/R – http://idms-training.com/
  - Datacom
- Other (non-standard) IBM databases:
  - TableBase
- Refreshing data
- Data creation – using utilities and 3rd-party products
- Accessing data on z/OS (through DRDA)
UNIT: Managing Test Data
Topics:
§ DB2 Data Migration
§ QSAM/VSAM Data Migration
§ IMS DL/I Database Migration
§ Migrating CICS Transaction Metadata
§ Migrating IMS Transaction Metadata
§ Back-up Slides
Topic Objectives
After completing this topic, you should be able to:
- Describe the options and considerations for DB2 test data migration and/or test data access
- Migrate DB2 DDL
- Describe the use of the DB2 DBA utilities
- Describe the RDz Data Tool utilities:
  - Migrate DB2 test data rows – all table rows, or table row subsets (physical "# of records" subsets and logical subsets)
- Modify DB2 table values on RD&T
- Modify (ALTER) a DB2 table
- Drop and re-create a DB2 table
- DRDA access to DB2 data
You will want to access the following PowerPoint deck: RDz Workbench – Using the Data Source Explorer.ppt, available at:
https://www.ibm.com/developerworks/mydeveloperworks/wikis/home?lang=en#/wiki/W2e35a50023ef_4b39_a867_04fb9e1d3329/page/Distance%20Learning%20Resources/attachments
DB2 Data Migration Considerations – 1 of 2
§ What DB2 "objects" to migrate for Unit Test?
- Tables:
  - Table definitions
  - Primary/Foreign Key constraints
- Stored Procedures
- Synonyms
- Views
- Indexes
- Grant statements – for access to the DB2 objects
… more than likely all of the above
§ Do you need to migrate DB2 "physical" objects (Storage Groups, Tablespaces, Indexspaces)?
- Possibly – it depends on whether your application's SQL access is dependent on table clustering
- If you wish to migrate the physical objects you can do so, provided that you can create an extract file with the correct top-down CREATE and ALTER statements
  - DB2 object DDL is plain text. It usually exists as one or more files with multiple SQL CREATE <…> statements, which can be run and committed/rolled back as a unit
§ Table data migration and the database design
- To migrate the table rows, you must understand your database's logical design – in order to INSERT or load the rows in accordance with Referential Integrity constraints (Parent/Child table dependencies). A sketch of parent-before-child DDL follows below.
  - To understand the model you can use the RDz Data Tools' "Overview Diagram"
  - For more on referential integrity: http://en.wikipedia.org/wiki/Referential_integrity
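To make the parent-before-child ordering concrete, the sketch below creates a parent table, its enforcing index, and then a dependent child table. The table, index, and constraint names are invented for illustration, and the DSNTEP2 plan and load library are assumptions that will differ by shop.

//RUNDDL   EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DSN)
  RUN PROGRAM(DSNTEP2) PLAN(DSNTEP2) -
      LIB('DSN810.RUNLIB.LOAD')
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  -- 1. Parent table first (hypothetical names)
  CREATE TABLE TESTID.POLICY
    (POLICY_NO   CHAR(10)    NOT NULL,
     HOLDER_NAME VARCHAR(40) NOT NULL,
     PRIMARY KEY (POLICY_NO));
  -- 2. Unique index that enforces the primary key
  CREATE UNIQUE INDEX TESTID.XPOLICY
    ON TESTID.POLICY (POLICY_NO);
  -- 3. Child table last - its foreign key references the parent
  CREATE TABLE TESTID.CLAIM
    (CLAIM_NO  CHAR(10) NOT NULL,
     POLICY_NO CHAR(10) NOT NULL,
     PRIMARY KEY (CLAIM_NO),
     FOREIGN KEY (POLICY_NO) REFERENCES TESTID.POLICY);
/*

Running the statements in this top-down order is what allows the whole script to be committed or rolled back as a unit.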
DB2 Data Migration Considerations – 2 of 2
§ Considerations for the scope of your DB2 migration:
- Cloning a DB2 subsystem:
  - Can migrate DASD volumes for DB2 objects and the DB2 catalog system
  - Requires Systems Programmer/DBA skills and authorization
- Database:
  - Some shops organize all tables for an application into a single database
  - A DB2 database is a logical entity. It is not a physical file like an Oracle or Sybase database
- Specific DB2 objects:
  - Migrate selected tables used in application unit testing use cases
- Logical Record – Parent/Child tables and rows selected from the Parent/Child tables:
  - Determine lineage – based on the logical model
  - Key List: the collection of DB2 columns that participate in defined Primary/Foreign Key relationships
  - Note that the IBM DB2 Tools suite contains a product named Grouper that effectively identifies the related tables that comprise a business application. Grouper can be used with the DB2 Tools "Test Database Generator" to speed migration
§ Options for discovery and analysis of DB2 table/application usage include:
- Interview Subject Matter Experts
- Discovery through RAA
- RDz Data Tools – used to understand the logical design
- Catalog queries – out of scope for this course
Options For Migrating DB2 Data
1. DB2 utilities and/or add-on products
- To use the DB2 utilities you will need:
  - DBA authority – over the objects you wish to execute the utilities against
  - An understanding of how to execute the utilities – in other words, you probably need to be a DB2 DBA
  - The DBA library concatenation in your ISPF logon proc
- Add-on tools that can be used for table migration include:
  - OPTIM Studio
  - DB2 Tools for z/OS
  - 3rd-party products
2. RDz Tools
- Data Tools – part of the Data perspective
3. DRDA access to DB2 data
Except for the RDz tools, deep content on the above products is beyond the scope of this course. However, an introduction to these options is presented later in this section.
Steps For Migrating DB2 Data
1. Identify and analyze DB2 objects for migration:
   1.1 Utilizing Subject Matter Experts
   1.2 Using RAA
   1.3 Using the RDz Data Tools
   1.4 Using DB2 System Catalog queries
2. Migrate DDL for selected DB2 objects
3. Migrate table data
1.1 Interviewing Subject Matter Experts for DB2 Object Identification
§ Discovery and data analysis questions such as:
- What DB2 objects (tables/views/synonyms/stored procedures, etc.) does this batch job/transaction use? (Note that DB2 tables are not represented by DD cards in JCL)
- What ancestor (parent) and dependent (child) tables exist for these tables?
- Are the table relationships maintained by DB2 Primary Key/Foreign Key constructs in the database DDL, or through application program logic?
  - If application program logic, who owns the programs and can answer data migration questions?
- Are there specific table relationships that are supposed to exist from a business processing point of view that are not defined through the DDL?
- What are the high-level schemas used to access the DB2 objects – i.e. who are the DB2 object owners (what are their TSO IDs)?
- What % of a test table does this program read?
- Do we need to migrate indexes – other than PRIMARY KEY indexes – for SQL performance purposes?
- Do we need to define DB2 Explain tables on UT? Do we need to migrate Explain data (for historical analysis)?
1.2 Using RAA for DB2 Data Discovery and Analysis
§ DB2 table usage lists, by:
- Batch job
- Transaction
- Run unit:
  - The initial program called by a transaction
  - The program invoked by EXEC PGM= in run stream JCL
- Individual program
§ DB2 Stored Procedure usage by program
§ Scope is by source inventory collected:
- Enterprise wide
- Subset of enterprise source assets
§ Reports run off-host (Windows server)
- Saves MIPS used during analysis
§ Simple web-based interface
§ Note that for CICS transactions (only), CICS Explorer – specifically the Interdependency Analyzer – produces usage analysis information similar to RAA's
1.2 RAA and Logical Record Discovery
§ Prerequisite – you must have imported the custom queries that ship with RAA
§ Batch job:
- Search on the batch job name
- From the Batch job summary – click on a batch job
- From the Batch job details, open Actions and select: DB2 Tables accessed by job's application programs
- Note that you could also use the Batch job diagram
§ Repeat for CICS transactions
1.2 RAA List of DB2 Tables Accessed in a Batch Job
(Screenshot: page through the list, or increase "Show groups of." Note that by clicking a hyperlinked table name, RAA produces a "where used" report.)
1.3 RDz Data Tools for Discovery and Analysis
§ Integrated DB2 tools, used for:
- DB2 object discovery, analysis, and browsing
- Table extract and load
  - We consider this a "best practice" for DB2 data migration
- SQL coding and testing
- Understanding your data model
§ Access to:
- Schemas – and all dependent objects
- DB2 catalogs
§ No need for a license purchase – the Data Tools come bundled with RDz
1.3 Discovering the Logical Model of your DB2 Data using the Data Tools
Steps described in detail in: RDz Workbench – Using the Data Source Explorer.ppt
§ From the RDz Data perspective:
- Create a connection to your DB2 system
- Filter the connection by all of the schemas (DB2 object owner IDs) you need
§ From your DB2 connection:
- Expand the connection
- Right-click the connection icon, and select: Add to Overview Diagram
- Inside the Overview Diagram, check the DB2 tables/views/synonyms that you wish to understand and click OK
- Note that – depending on the # of objects and the complexity of the relationship hierarchy – rendering the Overview Diagram could take time
§ Relationships are diagrammed from DB2-maintained referential integrity constructs
1.3 Discovering Entity/Relationships Using the Overview Diagram
§ From the Overview Diagram:
- Discover/analyze your object dependencies
- Consider:
  - Filters
  - Show/Hide Compartments
  - All Compartments
- Right-click and select: File > Save as Image File
§ Considerations:
- Only DB2-maintained relationships (through defined Primary/Foreign Key relationships) result in the Entity/Relationship model shown above
- The Overview Diagram tool only has access to the schema/table names exposed in your connection (recall the schema filter when you created the connection)
1.3 Discovering the Key List for a Logical Record Using the Overview Diagram
To learn the details of the "Key List" in a Logical Record, do the following:
§ From the Overview Diagram:
- Select each relationship line
- From the Properties view, transcribe (write out) the key columns
§ The complete set of key columns is the Key List, which will be used in logical data subsetting
1.4 Using DB2 Catalog Queries
§ Requires a DBA-level understanding of the DB2 System Catalog
§ Can be used to generate a list of tables in a database
§ See the slide notes for sample system catalog queries; a representative example follows below
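The slide notes themselves are not reproduced here, but a representative catalog query looks like the sketch below. It can be run from SPUFI, the RDz SQL editor, or (as shown) a DSNTEP2 batch job; the plan name, load library, and database name are assumptions.

//CATQRY   EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DSN)
  RUN PROGRAM(DSNTEP2) PLAN(DSNTEP2) -
      LIB('DSN810.RUNLIB.LOAD')
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  -- List every table in one DB2 database, with its tablespace
  SELECT CREATOR, NAME, TSNAME
    FROM SYSIBM.SYSTABLES
    WHERE DBNAME = 'TESTDB'
      AND TYPE   = 'T'
    ORDER BY CREATOR, NAME;
/*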
DB2 Migration Steps
1. Identify and analyze DB2 objects for migration
2. Migrate DDL for selected DB2 objects:
   2.1 Connect to DB2 z/OS and generate DDL on z/OS
   2.2 Save the DDL (optional)
   2.3 Connect to DB2 RD&T, and run the DDL to create the DB2 objects on RD&T
3. Migrate table data
Options and Considerations for Generating DDL
§ Options include:
- DB2 DBA tools on z/OS
  - IBM tools: DB2 Tools, High Performance Unload, OPTIM
  - 3rd-party tools: BMC, Candle, Platinum, etc.
- RDz Data Tools
  - Individual DB2 objects
  - All – or selected – DB2 objects within a schema
  - DDL generated out of the current (up-to-date, complete) DB2 system catalog
§ The DB2 DBA tools (IBM and 3rd-party) that run on z/OS are out of scope for this course:
- Every shop has different products and processes
- Running the tools requires DBA authority over the resources, and DBA skills
2.1 Using RDz to Generate DDL – 1 of 6
§ From the Data perspective – with the Data Source Explorer connected to your z/OS DB2 subsystem:
- Open a connection
- Expand the object list in the Data Source Explorer
- Depending on your connection's scope:
  - Select the entire list of objects
  - Or expand the list and select specific DB2 objects
- Right-click and select Generate DDL…
2.1 Using RDz to Generate DDL – 2 of 6
§ Un-check and check the options you want to include in the generated DDL
- Both the generation options and which DB2 objects you want DDL for
(Screenshot: the wizard that generates the DDL.)
2.1 Using RDz to Generate DDL – 3 of 6
§ Save and modify the generated DDL
- After generation finishes, save the DDL to a file in your workspace
  - Depending on how many objects, how many dependencies, and how large the schema for the objects is – this could take a while
- If you're ready to continue working, check the Open DDL box and click:
  - Next >
  - Finish
2.1 Using RDz to Generate DDL – 4 of 6
§ From the Script Editor, modify the generated DDL for your target system. For example:
- Considerations for the target DB2 subsystem (might be necessary):
  - Change the WLM ENVIRONMENT name
  - Change the BUFFERPOOL #
  - Change the SCHEMA name
§ Mandatory:
- Remove all occurrences of the following strings:
  - FOR SBCS DATA
  - PIECESIZE 4194304 K
- Check for any "red syntax error" messages in your script editor, and fix all problems before continuing
§ Save your script changes
2.2 Using RDz to Generate DDL – 5 of 6
§ If you have a relatively simple DB2 subsystem to migrate you can:
- Copy and paste your completed work into a SQL script run from a connection to RD&T
- Run the script
- Validate the results
- Return, make any corrections, and re-run
§ However, you will need to follow additional steps (we'll call this an "Advanced Migration Process") for any of the following:
- Your tables contain self-referencing Foreign Keys
  - Examples: Parts dependent on other Parts (Bill of Materials), or Employees reporting to other Employees with a "Reports_To" field as a Foreign Key, etc.
  - (One common workaround is sketched below)
- You will be loading the tables using z/OS DB2 DBA utility unload scripts
- You will be loading tens or hundreds of thousands of rows
§ We will cover the Advanced Migration Process on an upcoming slide
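One common workaround for the self-referencing Foreign Key case (a sketch under assumed table and constraint names, not the deck's prescribed procedure) is to drop the constraint before the load and re-add it afterwards:

//FIXUPFK  EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DSN)
  RUN PROGRAM(DSNTEP2) PLAN(DSNTEP2) -
      LIB('DSN810.RUNLIB.LOAD')
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  -- Run BEFORE the load: drop the self-referencing constraint
  -- so rows can be inserted in any order
  ALTER TABLE TESTID.EMPLOYEE
    DROP FOREIGN KEY REPORTS_TO_FK;
  -- Run AFTER the load: re-establish the constraint.
  -- Adding it back may leave the tablespace in CHECK-pending,
  -- so plan on a CHECK DATA run as well.
  ALTER TABLE TESTID.EMPLOYEE
    ADD CONSTRAINT REPORTS_TO_FK
    FOREIGN KEY (MGR_EMPNO) REFERENCES TESTID.EMPLOYEE;
/*

In practice the two ALTER statements would go in separate scripts bracketing the load; they are shown together here only for compactness.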
2.2 – 2.3 Using RDz to Generate DDL – 6 of 6
§ From your script, press Ctrl+A (to select all lines), then Ctrl+C (to copy)
§ From the Data Source Explorer, connect to your RD&T DB2 subsystem:
- Open a SQL script
- Paste the copied lines
- Run the SQL (or press F5)
§ Optionally, you may want to save your finished DDL file
DB2 Migration Steps – 3. Migrate Table Data
1. Identify and analyze DB2 objects for migration
2. Migrate DDL for selected DB2 objects
3. Migrate DB2 table data:
   3.1 (If doing logical data subsetting – a "Logical Record") Using the table relationships, discover the Key List for the tables participating in the data migration
   3.2 Select your data migration strategy:
       - Cloned DB2 subsystem
       - Complete table (all rows)
       - Physical (# of rows) subset
       - Logical subset ("Logical Record")
       Note that different tables can be migrated using different strategies
   3.3 Generate the table data – options include:
       - RDz Data Tools extract
       - DB2 utilities
       - 3rd-party utilities
   3.4 Migrate the table data to RD&T
       - Unless using RDz – in which case no separate data migration step is needed
   3.5 Load the table rows on RD&T
Options and Considerations for Migrating DB2 Data
§ DB2 utilities
- To use the DB2 utilities you will need:
  - DBA authority – over the objects you wish to execute the utilities against
  - An understanding of how to execute the utilities (in other words, you probably need to be a DB2 DBA)
  - The DBA library concatenation in your ISPF logon proc
§ RDz Tools
- Data Tools – part of the Data perspective
§ Add-on tools used for table migration:
- DB2 Tools for z/OS
- OPTIM Studio
- File Manager (green screen version)
- 3rd-party products
The above products are beyond the scope of this course – however, a list of them is provided at the end of this section.
Migrating DB2 Data Using RDz – Complete Table or Physical # of Rows
§ RDz's Data Tools make migrating a complete table and/or a subset of the table fast and easy. Steps – from the Data perspective:
1. From your z/OS connection, in the SQL Query tool, code a SELECT * FROM <table> – optionally using a WHERE clause to subset rows
2. From the SQL Results view:
   - Right-click over the rows, and select Export > All Results
   - Specify:
     - Plain Text
     - Output format: User Defined
     - Delimiter – something that won't be in the data – e.g. |
3. From your RD&T DB2 connection:
   - Right-click over the table, and select: Data > Load…
   - Specify the output extract file
   - Verify the operation in the SQL Status view – and by opening the table for Edit
§ Note that to migrate a specific number of rows: from Preferences > Data Management > SQL Development > SQL Results View Options, set the Max display count to your # of rows threshold
Migrating DB2 Data Using RDz – Migrating a Logical Record
§ In DB2 database designs, tables are related through a series of Primary and Foreign Keys
§ Previously (as a byproduct of understanding the physical design of our data) we learned how to transcribe the primary and foreign keys using a combination of the RDz:
- Overview Diagram
- Properties view/Details
§ In doing this we were working at a logical level of abstraction. Now we must select and then utilize the actual key values in the tables, in order to extract the rows necessary to create a logical record for use in our testing
§ There are multiple ways of doing this, but we are partial to the following approach to creating a logical record:
3.1 Using the Key List from the logical model, create a single-table join of all tables that participate in a parent/child dependency tree
3.2 Save the result values of this join
3.3 Use the saved join result set key values in additional SELECT statements against individual tables
3.4 Export the individual SELECT result set rows
3.5 Load the tables in RD&T from the exported result set rows
3.1 – Create a Join of All Logically Related Tables
§ There are several ways you can do this step, including hand-coding the SQL join
§ However, we believe that this use case is actually an excellent candidate for graphic SQL (the SQL Query Builder), which is described in the PowerPoint: RDz Workbench – Using the Data Source Explorer
§ Steps:
- Open the Overview Diagram – which documents the logical model. Note that you will use the Properties view/Details to expose the Key List
- From the Data Project Explorer, create a new Data Design Project
- From the Data Design Project, create a new SQL script (Select statement) using the SQL Query Builder
- Using the Query Builder:
  - Add all of the tables documented in the Overview Diagram to the query
  - Add all of the table joins necessary to correctly relate the tables – obtain this information from the links in the Overview Diagram and the Properties/Details view
  - If the Properties view shows additional join criteria (through multi-column keys), you can either add the additional WHERE clause criteria to your query manually, or use the SQL Query Builder
  - Run the statement
3.1 – Open the Overview Diagram
§ Note the tables and their relationships
§ Use the Properties view to ascertain keys
3.1 Code the SQL to Join the Tables
§ Use the SQL Query Builder to create the Key List
§ Note the two added WHERE clauses for the multi-column join between EMPPROJACT and PROJACT
§ Select only key values from the tables. Do not SELECT *
§ Save (Export) the SQL to a text file
(Screenshot: the query and its results. A sketch of the resulting SQL appears below.)
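For reference, the kind of statement the Query Builder produces at this step looks like the sketch below, using the DB2 sample tables named on the slide. It could equally be run from the RDz SQL editor; the DSNTEP2 plan and library are assumptions.

//KEYLIST  EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DSN)
  RUN PROGRAM(DSNTEP2) PLAN(DSNTEP2) -
      LIB('DSN810.RUNLIB.LOAD')
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  -- Join the related tables on their Primary/Foreign Key columns
  -- and select ONLY the key columns (the "Key List")
  SELECT E.EMPNO, P.PROJNO, PA.ACTNO
    FROM DSN8810.EMP        E,
         DSN8810.EMPPROJACT EPA,
         DSN8810.PROJ       P,
         DSN8810.PROJACT    PA
    WHERE E.EMPNO    = EPA.EMPNO
      AND EPA.PROJNO = P.PROJNO
      AND EPA.PROJNO = PA.PROJNO     -- multi-column join:
      AND EPA.ACTNO  = PA.ACTNO;     -- PROJNO plus ACTNO
/*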
3.2 Use the Exported Key Values in SELECT Statements to Generate Export Rows
§ Use the SQL Script editor to create SELECT * statements for all of the tables participating in the data migration
§ Code or copy/paste the Key List values from the saved text file into the WHERE clauses
3.2 Use the Exported Key Values in SELECT Statements to Generate Export Rows
§ Export all of the SELECT * row values to: Plain Text, Output format: User Defined
§ Choose a Delimiter value that can't exist in the data rows (the | pipe character is used here)
3.2 Optional – Check out the Exported Row Values
§ Open any of the exported *.txt files
§ Note that you could edit the data in plain text
- Although you must take care to understand the data format, delimiting, and especially referential integrity (Foreign Key values)
3.3 Load the Migrated Tables From the Saved SQL Results Files
§ Connect to your RD&T database
§ Right-click on a table and select Load. Select the same Column delimiter you chose for Export
Considerations and Limitations using the RDz Data Tools for Migration
§ Graphical data cannot be saved to plain text
- You will have to use a DB2/DBA product to migrate tables with columns of complex datatypes such as BLOB and CLOB
§ You can use DB2 Views to create the SQL Results rows used in migration
- Often such views already exist to materialize multi-table joins
§ You don't have to join all of the tables that are related in a parent/child/grandchild dependency chain
- You can work iteratively, from the top of the related-table chain downwards
- This may require some manual transcription of key values
§ If you have specific key values that you know are necessary or useful to your testing, you can add those into the initial join(s) at step 3.1
§ The one key concern is that you must load the tables in the same hierarchical sequence built into the data model
DB2 DBA Utilities and Data Tools – Short Description
§ Products:
- OPTIM Test Data Management Tools
- DB2 Data Tools
- HPU (High Performance Unload)
§ DB2 utilities:
- COPY/UNLOAD/LOAD
- DSNTIAUL
OPTIM Test Data Management – Solution for z/OS
§ IBM offers a product named OPTIM that provides a comprehensive wizard-driven process that walks you through the process described in the previous steps in a formal, structured fashion:
- With an enhanced user interface – that both describes the migration deliverables and simplifies the practice of doing this
- And with no limitations or restrictions on DB2 datatypes
§ Please see the following web-based information on OPTIM:
http://www-01.ibm.com/software/data/optim/streamline-test-data-management/
DB2 Data Tools for z/OS
§ IBM also offers a comprehensive set of DBA tools that – while not specifically created for test data management (like the OPTIM solution) – can be used effectively to simplify migrating DB2 tables and DB2 objects to RD&T
§ From this considerable suite of tools, consider:
- DB2 Grouper – to determine logically related subsets of tables
- DB2 High Performance Unload
- DB2 Cloning Tool
§ Note that traditionally, these tools require DBA experience and authorization level
§ Link for the DB2 Data Tools:
http://www-01.ibm.com/software/data/db2imstools/products/db2-zos-tools.html
DSNTIAUL – for Unloading DB2 Table Data
§ Finally, IBM ships a utility program (DSNTIAUL) that can be used effectively to unload data from DB2 tables (it doesn't manage the migration of Indexes/Views/Synonyms, etc. – but you can use the RDz Data Tools for that)
§ DSNTIAUL is:
- Free
- Relatively simple to use
- Able to unload CLOB/BLOB tables (see the documentation on the extended JCL for this)
- Not as efficient as the Data Tools High Performance Unload … but available for use by any application developer who has been granted RUN authority on the PLAN used to bind DSNTIAUL
§ See the next slide for sample DSNTIAUL JCL
§ Link to IBM documentation on using DSNTIAUL:
http://publib.boulder.ibm.com/infocenter/dzichelp/v2r2/index.jsp?topic=%2Fcom.ibm.db2z9.doc.ugref%2Fsrc%2Ftpc%2Fdb2z_dsntiaul.htm
DSNTIAUL Sample JCL
//UNLOAD   EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DSN)
  RUN PROGRAM(DSNTIAUL) PLAN(DSNTIAUL) PARMS('SQL,250') -
      LIB('DSN810.RUNLIB.LOAD')
//SYSPRINT DD SYSOUT=*
//SYSUDUMP DD SYSOUT=*
//SYSREC00 DD DSN=DSN8UNLD.SYSREC00,
//            UNIT=SYSDA,SPACE=(32760,(1000,500)),DISP=(,CATLG),
//            VOL=SER=SCR03
//SYSREC01 DD DSN=DSN8UNLD.SYSREC01,
//            UNIT=SYSDA,SPACE=(32760,(1000,500)),DISP=(,CATLG),
//            VOL=SER=SCR03
//SYSPUNCH DD DSN=DSN8UNLD.SYSPUNCH,
//            UNIT=SYSDA,SPACE=(800,(15,15)),DISP=(,CATLG),
//            VOL=SER=SCR03,RECFM=FB,LRECL=120,BLKSIZE=1200
//SYSIN    DD *
  LOCK TABLE DSN8810.EMP IN SHARE MODE;
  LOCK TABLE DSN8810.PROJ IN SHARE MODE;
  SELECT * FROM DSN8810.PROJ;
  SELECT * FROM DSN8810.EMP
    WHERE WORKDEPT LIKE 'D%'
    ORDER BY EMPNO;
Note that your library and (probably) Plan name will be different from the above example
DB2 DBA Utilities
§ Along the lines of the Data Tools (but free), IBM ships a large number of DB2 utilities – a few of which can be effectively used in an RD&T data migration strategy
§ Specifically:
- UNLOAD
- LOAD
§ The DB2 DBA utilities are:
- Free
- Not application-developer-level processes – they require DB2 DBA skills/experience and DB2 DBA authorization
- Able to unload CLOB/BLOB tables
- Not as efficient as the Data Tools High Performance Unload … but more efficient than DSNTIAUL
§ This link contains background information on the utilities:
http://www-01.ibm.com/software/data/db2imstools/db2utilsuite/
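As a hedged sketch of the UNLOAD/LOAD pair in action (the DSNUPROC procedure, utility IDs, tablespace, and dataset names are placeholders that vary by shop):

//* Unload on z/OS
//UNLD     EXEC DSNUPROC,SYSTEM=DSN,UID='TSTUNLD'
//SYSREC   DD DSN=TESTID.EMP.UNLOAD,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(10,10))
//SYSPUNCH DD DSN=TESTID.EMP.LOADCTL,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(TRK,(1,1))
//SYSIN    DD *
  UNLOAD TABLESPACE DSN8D81A.DSN8S81E
    FROM TABLE DSN8810.EMP
/*
//* Load on RD&T, after FTP'ing SYSREC and SYSPUNCH across
//LOAD     EXEC DSNUPROC,SYSTEM=DSN,UID='TSTLOAD'
//SYSREC   DD DSN=TESTID.EMP.UNLOAD,DISP=SHR
//SYSIN    DD *
  LOAD DATA INDDN(SYSREC) LOG NO RESUME YES
    INTO TABLE TESTID.EMP
/*

In practice you would load with the LOAD statement UNLOAD writes to SYSPUNCH, since it carries the exact field specifications for the unloaded records; the in-line statement above is simplified for illustration.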
Partial list of 3rd-Party Products for DB2 Migration
§ IBM
- OPTIM Data Studio – http://publib.boulder.ibm.com/infocenter/idm/v2r2/index.jsp?topic=/com.ibm.datatools.dsa.nav.doc/topics/helpindex_dsa_sdf.html
- DB2 Tools for z/OS – http://www-01.ibm.com/software/data/db2imstools/products/db2-zos-tools.html
- File Manager (RDz/FM plugin and green screen) – http://publib.boulder.ibm.com/infocenter/idm/v2r2/index.jsp?topic=/com.ibm.datatools.dsa.nav.doc/topics/helpindex_dsa_sdf.html
  - Note that there are many instances where you can utilize File Manager on RD&T. Please check with your IBM representative – based on existing mainframe entitlement, usage of FM can be evaluated. Note: some requests may not be approved. This requires a special licensing dispensation from IBM; it is not automatically entitled with the purchase of RD&T.
§ EMS SQL Manager – http://www.sqlmanager.net/en/products/db2/manager
§ Compuware – File-AID for DB2 – http://www.compuware.com/mainframe-solutions/file-aid-data-management.html
DB2 and DRDA Access to DB2 on z/OS
§ As opposed to migrating DB2 data, you can use DRDA to access the tables and other DB2 objects on z/OS – from applications running on RD&T
§ This (obviously) makes the migration steps unnecessary; however, it requires system setup and configuration work on RD&T – most likely involving your MVS System Programming staff
§ There is excellent documentation on DRDA available from various DB2 Solution Center sources:
http://publib.boulder.ibm.com/infocenter/db2luw/v9r7/index.jsp?topic=/com.ibm.db2.luw.qb.dbconn.doc/c0004776.html
§ We will continue to work through the DB2 data migration use cases in this section.
Cloning a DB2 Subsystem
§ In the previous section we discussed the benefits (and concerns) of cloning an entire testing system
§ DASD Migration was proposed as the best practice for mass data migration
§ For DB2 cloning, the only caveat is that you must migrate both the VOLSERs for the catalog and the VOLSERs for user DB2 tables
(Diagram: the DASD Migration Utility copies the z/OS VOLSER holding the DB2 System Catalog and the VOLSER holding DB2 table data to their counterparts on the RD&T Linux system.)
Summary of migration strategies for DB2 data
§ Clone of DB2 subsystem – DDL migration and test table data migration: DASD migration
§ Complete database – DDL: catalog queries to generate a list of tables in a database, then RDz Data Tools to migrate specific tables; data: DB2/DBA products and utilities – including HPU and OPTIM Data Tools
§ Specific tables – DDL: DB2/DBA products and utilities – including HPU and OPTIM Data Tools; data: RDz Data Tools
§ Logical Record – DDL: DB2/DBA products and utilities – including HPU and OPTIM Data Tools; data: RDz Data Tools
§ Data masking required – OPTIM
§ Access to DB2 data on z/OS – DRDA
UNIT: Managing Test Data
Topics:
§ DB2 Test Table and Data Migration
§ QSAM/VSAM Data Migration
§ IMS DL/I Database Migration
§ Migrating CICS Transaction Metadata
§ Migrating IMS Transaction Metadata
§ Back-up Slides
Topic Objectives
After completing this topic, you should be able to:
- Describe the options for QSAM/VSAM test data migration and/or test data access
- Identify the documentation on the methods for test data migration/management
- QSAM test data migration:
  - All records
  - Record subsets: physical (# of records) subsets and logical subsets
- VSAM test data migration:
  - All records
  - Record subsets: physical (# of records) subsets and logical subsets
  - For one file
  - For a set of files – identified by pattern matching
- Function Shipping to VSAM through CICS
You will want to access the following PowerPoint deck: RDz Workbench – Using Remote Systems Explorer.ppt, available at:
https://www.ibm.com/developerworks/mydeveloperworks/wikis/home?lang=en#/wiki/W2e35a50023ef_4b39_a867_04fb9e1d3329/page/Distance%20Learning%20Resources/attachments
VSAM/QSAM Data Migration Considerations
§ Considerations for the scope of your VSAM/QSAM migration – what VSAM/QSAM files to migrate for Unit Test?
- Cloning a test subsystem:
  - Can migrate DASD volumes for VSAM/QSAM files and the VSAM user catalog system
  - Requires Systems Programmer skills and authorization
- Specific VSAM/QSAM files:
  - Migrate selected files used in application unit testing use cases
  - QSAM files can be dragged/dropped from z/OS to RD&T using RDz. VSAM files must be Repro'd to QSAM, and then dragged/dropped
  - Libraries can be FTP'd or XMIT'd directly to RD&T
  - Both QSAM and VSAM can be FTP'd to RD&T
- Subsetting using REPRO for VSAM/QSAM:
  - Physical record subsets
  - Logical Record subsets – Parent/Child files and records selected from the Parent/Child files; determine lineage based on "application-maintained referential integrity"
§ Options for discovery and analysis of VSAM/QSAM file/application usage include:
- Interview Subject Matter Experts and/or read the code: JCL and program listings
- Discovery through RAA
- RDz – used to understand the logical design
Options For Migrating VSAM/QSAM Data
1. REPRO for VSAM and QSAM files
   - Only necessary for QSAM if subsetting the file
2. IEBCOPY or XMIT or FTP for library datasets
3. Add-on tools that can be used for file migration include:
   - OPTIM Test Data Management Solution
   - VSAM tools for z/OS – such as FLASHCOPY
   - 3rd-party products for VSAM/QSAM migration
4. RDz Tools
   - Filters and the Outline view – for identification and analysis
   - Drag & Drop of QSAM files – with 8.0.3 (client & server)
5. DRDA access to VSAM data
6. z/OS XMIT/RECEIVE – of datasets, as an alternative to RDz Drag & Drop
7. RD&T Data Tools
   - Ditto for VSAM – see slides in this section
   - Note that Ditto is a product name, not a catch-phrase
Deep content on the add-on tools is beyond the scope of this course. However, an introduction to these options is presented later in this section.
Steps For Migrating VSAM/QSAM Data
§ Prerequisite – identify and analyze VSAM/QSAM objects for migration:
- Utilizing Subject Matter Experts
- Using RAA
- Using RDz product analysis features
§ On the z/OS LPAR:
1. Obtain create-dataset JCL (VSAM/GDG DEFINE) for each VSAM/QSAM file
   - Note that this is considerably more complicated for KSDS files with alternate indexes
2. Repro the files:
   - For all VSAM data
   - For any QSAM files you wish to physically subset
3. Transfer the files to RD&T:
   - Use RDz Drag & Drop … or … FTP for mass file transfer. Transfer:
     - The file-create / VSAM DEFINE JCL
     - The Repro output datasets
4. Clean up (delete the Repro datasets)
§ On the RD&T LPAR:
1. Run the create-dataset (VSAM/GDG DEFINE) JCL
2. Repro the VSAM/QSAM data back into the new datasets
3. Clean up (delete the Repro datasets and/or the create-dataset JCL)
1.1 Interviewing Subject Matter Experts for VSAM/QSAM File Identification
§ Discovery and data analysis questions such as:
- What VSAM/QSAM files does this batch job/transaction use?
  - Note that we're only interested in INPUT files:
    - Not temporary/passed datasets generated by the application
    - Not output files
- What ancestor (parent), dependent (child), or related (sibling) files exist for the input files – if you're looking to logically subset?
- Are there specific file relationships that are supposed to exist from a business processing point of view that are not evident from the code?
- What are the high-level qualifiers used to access the VSAM/QSAM objects – i.e. who are the VSAM/QSAM object owners (what are their TSO IDs)?
- What % of a test file does this program access?
1.2 Using RAA for VSAM/QSAM Data Discovery and Analysis
§ VSAM/QSAM file usage lists, by:
- Batch job
- Transaction
- Run unit:
  - The initial program called by a transaction
  - The program invoked by EXEC PGM= in run stream JCL
- Individual program
§ Scope is by source inventory collected:
- Enterprise wide
- Subset of enterprise source assets
§ Reports run off-host (Windows server)
- Saves MIPS used during analysis
§ Simple web-based interface
1.2 RAA and Logical Record Discovery
§ You can use Rational Asset Analyzer to effectively pinpoint files that need to be migrated to RD&T
§ Batch job:
- Search on the batch job name
- From the Batch job summary – click on a batch job
- From the Batch job details – check out the DDs tab
§ A similar RAA function exists for VSAM/QSAM files used in CICS:
- Regions – look in the CICS files tab
- Programs – look in the Data sets tab
1.2 RAA List of VSAM/QSAM Files Accessed in a Batch Job
(Screenshot: page through the list, or increase "Show groups of." Note also that by clicking a hyperlinked DSN, RAA produces "where used" reports.)
1.3 RDz Data Tools for Discovery and Analysis
§ Two functions (with a JCL file open in LPEX):
- Outline view
- Filter view
VSAM/QSAM Migration Steps
1. Identify and analyze VSAM/QSAM objects for migration
2. Migrate the selected VSAM/QSAM datasets
§ On z/OS:
   2.1 Code JCL to Repro the file
   2.2 Run the Repro job – check the return code
   2.3 FTP the Repro file to RD&T
   2.4 Delete the Repro file
§ On RD&T:
   2.5 Code JCL to create the new QSAM dataset, or DEFINE the VSAM Cluster or GDG Model
   2.6 Run the JCL – check the return code
   2.7 Code JCL to Repro the data file from z/OS into the new QSAM/VSAM file
   2.8 Run the Repro job – check the return code
   2.9 Delete the Repro file
§ Note – the above migration steps:
- Are different for migrating library (PDS) datasets
- Are more complex if migrating a VSAM file with alternate indexes
- Can be automated – to a large degree – by utilizing an IBM-supplied toolkit, in the form of two REXX Execs and a manual editing process
(A single-file sketch of the z/OS-side steps follows below.)
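As a minimal single-file sketch of the z/OS-side steps above (dataset names, the IP address, and the logon credentials are all placeholders):

//MIGR1    JOB (ACCT),'VSAM TO RDT',CLASS=A,MSGCLASS=X
//* Step 1: Repro the VSAM cluster to a temporary QSAM dataset
//REPRO    EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//OUTDD    DD DSN=TESTID.ORDERS.REPRO,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(10,10)),
//            DCB=(RECFM=VB,LRECL=4096,BLKSIZE=0)
//SYSIN    DD *
  REPRO INDATASET(PROD.ORDERS.KSDS) OUTFILE(OUTDD)
/*
//* Step 2: FTP the Repro output to RD&T in binary
//FTPSTEP  EXEC PGM=FTP,PARM='10.1.1.99 (EXIT'
//SYSPRINT DD SYSOUT=*
//INPUT    DD *
tsouser
password
BINARY
PUT 'TESTID.ORDERS.REPRO' ORDERS.REPRO
QUIT
/*
//* Step 3: delete the temporary Repro dataset once transferred
//CLEANUP  EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DELETE TESTID.ORDERS.REPRO
/*

The RD&T-side mirror of this job would DEFINE the cluster and REPRO the transferred file back in, as listed in the steps above.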
Options and Considerations for VSAM/QSAM Data Migration
§ Options include:
- IBM OPTIM Test Data Management for z/OS
- 3rd-party tools for migrating VSAM/QSAM files
  - Note that both the OPTIM and 3rd-party tools often require MVS Systems Programmer-level skills and authorization
- RDz 8.0.3 file copy/paste process
  - Can be used to selectively copy QSAM (not VSAM) files from your z/OS LPAR to RD&T: sequential datasets, or individual members of a library
  - Note that dragging and dropping QSAM files requires RDz version 8.0.3 installed on your client machine, and on the z/OS and RD&T LPARs (servers)
- JCL/Repro/FTP process
  - A nine-step process that can be used to subset files and move them to RD&T
  - IBM is supplying a toolkit that allows you to automate these steps – it is especially helpful in moving large #s of VSAM/QSAM files to RD&T
REXX Execs For VSAM/QSAM Migration
§ Two REXX Execs and a manual step help simplify and automate the process of migrating multiple VSAM/QSAM files from a z/OS LPAR to RD&T. The REXX Execs support the following file types:
- QSAM:
  - Sequential datasets
  - Partitioned datasets
- VSAM:
  - GDG Models (and GDG datasets)
  - ESDS
  - RRDS
  - KSDS
  - Alternate Index (AIX)
- Migrated datasets
§ Prerequisite – you will need to create a simple and small JCL PDS on both RD&T and your z/OS LPAR
REXX Execs For VSAM/QSAM Migration – Process (Overview)
The steps are as follows:
1. Run the 1st REXX Exec (MAKELIST), passing it a partial DSN for your files
   - MAKELIST is essentially a LISTC LEVEL(HLQ.QUAL)
   - The REXX Exec will produce a file containing a list of DSNs along with their DCB attributes
2. Edit the file produced by the REXX Exec in RDz:
   - Remove (delete) DSNs you don't want migrated from z/OS to RD&T
   - For each dataset to be migrated, optionally qualify the number of records you want copied over to RD&T
3. Run the 2nd REXX Exec (CREATJCL), passing it:
   - The RD&T volume name (VOL=SER) you wish to create the datasets on
   - The IP address of your RD&T image
   - The high-level qualifier of the receiving ID (that you'll be FTP'ing the temporary Repro files to)
   - This REXX Exec will – based on your edited file of DSNs – produce a set of JCL jobs that you can submit, which automates the process of migrating the datasets listed in the file
4. Submit the JCL jobs produced in step 3 – this will:
   - Generate Repro datasets for your QSAM and VSAM files
   - Generate "create dataset" JCL and IDCAMS DEFINE statements
   - FTP your Repro datasets and create statements to RD&T
   - You will run the jobs in a specific order (see the slide on that) so that GDG models and AIX VSAM files are created correctly
5. Log in to RD&T to run the FTP'd jobs, which:
   - Create the QSAM/VSAM files and GDG models on RD&T
   - Repro the datasets into your QSAM/VSAM files on RD&T
   - Delete the Repro datasets
REXX Execs For VSAM/QSAM Migration – Step 1 of 5 – Create the DSN List (MAKELIST)
§ Input:
- Menu Manager(?) – prompt for a dataset name pattern: LISTC LEVEL(qualification)
§ Process:
- If no files exist by that name, send back an error message and exit; else continue
- Iterate over the list of DSNs that match the dataset name pattern – for each dataset:
  - Issue LISTDSI
  - Iterate over the LISTDSI output:
    - If QSAM – write the DSN and file attributes (RECFM, LRECL, BLKSIZE, etc.)
    - If VSAM – issue LISTC ALL('DSN'), then iterate over the result file to write out the VSAM file DSN and the attributes necessary to re-create the dataset on RD&T
    - If a GDG Model – write the DSN to allow the process Exec to generate a GDG model statement
§ Output:
- List dataset (see the next slide for the layout)
- Menu Manager output with the # of QSAM/VSAM files – not available in the beta release
REXX Execs For VSAM/QSAM Migration – Step 2 of 5 – Edit the DSN List
§ Edit the DSN list dataset:
- Delete datasets not wanted/needed to be copied down to UT
- Update the information according to your file migration requirements:
  - # of records to be Repro'd for each file – if left at 00000, all records in a dataset will be Repro'd
  - Key range – for VSAM files – not supported in the beta
REXX Execs For VSAM/QSAM Migration – Step 3 of 5 – Process the DSN List (CREATJCL)
§ Inputs:
- Input dataset name – not in the beta
- REPRO output high-level DSN qualifier
- Volume
- FTP parameters
§ Process – from the z/OS LPAR, run the REXX Exec, which:
- Reads the DSN list dataset and iterates over the records
- For each file, creates individual JCL PDS members to migrate to RD&T:
  - JCLA – for IDCAMS AIX datasets
  - JCLB – for IDCAMS BLDINDEX statements
  - JCLD – to delete all temporary Repro datasets produced in the process
  - JCLFC – to FTP all job steps over to RD&T (replaced by Drag & Drop in the beta)
  - JCLFI – to FTP all Repro QSAM files over to RD&T
  - JCLG – for IDCAMS GDG model create statements
  - JCLI – for FTP statements to move libraries (PDSs) over to RD&T
  - JCLIUT – not used
  - JCLP – for IDCAMS PATH statements
  - JCLR – for IDCAMS REPRO statements to Repro the z/OS datasets to temporary QSAM files
  - JCLRUT – for IDCAMS REPRO statements to Repro the temporary QSAM files back into the original datasets on RD&T
  - JCLV – for IDCAMS KSDS/ESDS/RRDS datasets
§ Output:
- The set of JCL members (see above)
- Statistics: number of files processed (not in the beta)
REXX Execs For VSAM/QSAM Migration – Steps 4 and 5 – Run the Batch Jobs
§ On the z/OS LPAR:
1. JCLR – create the REPRO (unload) datasets on z/OS
2. JCLFI – FTP all QSAM (REPRO) datasets
3. Drag & Drop the PDS members to a library on RD&T
§ On the RD&T LPAR:
4. JCLG – create the GDGs first, to build the GDG Models
5. JCLV – create the VSAM datasets
6. JCLQ – create the QSAM datasets
7. JCLRUT – run Repro to reload the VSAM and QSAM datasets
8. JCLA – create the AIX datasets
9. JCLP – create the PATH datasets
10. JCLB – run BLDINDEX
11. JCLD – run the delete-datasets job
§ On the z/OS LPAR:
12. JCLI – FTP all library datasets
13. JCLD – run the delete-datasets job
§ Run the jobs in the above order, due to dependencies in the VSAM dataset creation process for GDGs and AIX (alternate index) files
§ Between JCL job submissions you should check the step return codes in JES – on both z/OS and RD&T
REXX Execs – Considerations
§ You must have QSAM/VSAM file read authorization for all datasets you wish to migrate (Repro) using this approach
§ The FTP JCL requires you to set up a //NETRC dataset – with an internal file format as follows:
000001 MACHINE nnn LOGIN tso-id PASSWORD password
- MACHINE nnn – the RD&T IP address
- LOGIN tso-id – the RD&T TSO logon ID
- PASSWORD password – the RD&T TSO password for the logon ID
§ The TSO ID you use to log in to RD&T will be the same high-level qualifier you pass to the process REXX Exec as the output high-level DSN qualifier
- This assumes you wish to take advantage of the FTP automation produced by the Exec
- For example, if you have RDz 8.0.3 installed (on the client and the servers of both your z/OS LPAR and RD&T) and wish to manually drag & drop the Repro datasets, that is another option
z/OS XMIT/RECEIVE of Datasets
§ The traditional method of moving data from one LPAR to another in the same network
- Benefits:
  - Good for an individual or small number of files
  - Can be done from TSO READY mode or ISPF =6
  - Fast and efficient – does not tie up the RDz client machine (as opposed to RDz Drag & Drop; XMIT goes directly from z/OS to the Linux box)
- Drawbacks:
  - Manual effort
  - One file per XMIT
  - Can use JCL – through executing TSO in batch (IKJEFT01), as sketched below. Note that there is also an XMIT statement in JCL, but that approach is limited to 80-column card data
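A hedged sketch of driving XMIT from batch TSO (the node, user ID, and dataset names are placeholders):

//XMITJOB  JOB (ACCT),'XMIT TO RDT',CLASS=A,MSGCLASS=X
//* Package a PDS into one sequential dataset. After FTP'ing it
//* across, run RECEIVE INDATASET('TESTID.SOURCE.XMIT') on RD&T
//* to unpack it.
//TSOSTEP  EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  XMIT NODE.USERID DATASET('TESTID.SOURCE.PDS') +
       OUTDATASET('TESTID.SOURCE.XMIT')
/*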
Migrating VSAM/QSAM Data Using RDz – Migrating a Logical Record
§ In VSAM/QSAM technology, associated files are related through application (program) logic
- This means that there is no "catalog" and no system-maintained referential integrity controls that you can depend on to understand or utilize for logical data subsetting
- With VSAM (KSDS datasets) you read an input record from FILE1, then find the associated record using an index
- With QSAM you sort input files based on key data, and read them in a master/detail logic sequence
§ Because of this, you have two options for creating and migrating the equivalent of a DB2 Logical Record in VSAM/QSAM:
- For VSAM you can use the FROMKEY/TOKEY extensions to the REPRO operation to select specific records from a dataset, for example:
  REPRO INFILE(INFILE) OUTFILE(OUTFILE) FROMKEY(123456) TOKEY(654321)
- For QSAM you will have to:
  - Repro the data
  - Use a data file editor (such as the RDz 8.0.3 file editor) to custom-edit specific values in your dataset
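Wrapped in a complete (hypothetical) job, the key-range REPRO shown above looks like this:

//KEYSUB   JOB (ACCT),'LOGICAL SUBSET',CLASS=A,MSGCLASS=X
//* Extract only the key range that makes up the "logical record"
//STEP1    EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//INFILE   DD DSN=PROD.POLICY.KSDS,DISP=SHR
//OUTFILE  DD DSN=TESTID.POLICY.SUBSET,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(5,5)),
//            DCB=(RECFM=VB,LRECL=404,BLKSIZE=0)
//SYSIN    DD *
  REPRO INFILE(INFILE) OUTFILE(OUTFILE) -
        FROMKEY(123456) TOKEY(654321)
/*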
Partial list of 3rd-Party Products for VSAM/QSAM Migration
§ IBM
- OPTIM Test Data Management Solution
- VSAM/QSAM administration tools for z/OS – http://www-01.ibm.com/software/data/VSAM/QSAMimstools/products/VSAM/QSAM-zos-tools.html
- File Manager – part of the Problem Determination Tools suite: http://www-01.ibm.com/software/awdtools/deployment/
§ MacKinney Systems – ISPF VSAM Utility – http://www.mackinney.com/products/data-management/ispf-vsam-utility.html
§ Computer Associates – CA FAVER 2 – http://www.ca.com/~/media/Files/ProductBriefs/faver2_product_brief.pdf
§ Compuware – File-AID for VSAM/QSAM – http://www.compuware.com/mainframe-solutions/file-aid-data-management.html
§ Software Engineering of America – Fastpack product line – http://www.seasoft.com/dasd-dataset-management.asp
Summary of migration strategies for QSAM/VSAM data
§ Complete file – QSAM: REPRO, OPTIM; VSAM: REPRO, OPTIM
§ Physical subset of file – QSAM: REPRO, OPTIM; VSAM: REPRO, OPTIM
§ Logical test data subsetting required – QSAM: OPTIM; VSAM: REPRO with FROMKEY/TOKEY, OPTIM
§ Data masking required – QSAM: OPTIM; VSAM: OPTIM
§ Direct to z/OS data from RD&T – QSAM: ?; VSAM: CICS Function Shipping
§ Volume (DASD) migration – QSAM: X; VSAM: ?
UNIT: Managing Test Data
Topics:
§ DB2 Test Table and Data Migration
§ QSAM/VSAM Data Migration
§ IMS DL/I Database Migration
§ Migrating CICS Transaction Metadata
§ Migrating IMS Transaction Metadata
§ Back-up Slides
Topic Objectives
After completing this topic, you should be able to:
- Describe how to use these course materials
- Differentiate between the two types of slide topics in the learning modules
- Navigate to additional learning resources for both RDz and COBOL
Migrating IMS Databases
§ IMS database migration
§ No need for "physical objects" (tablespaces)
§ Must understand the physical design
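As a hedged illustration of one common approach (not necessarily the method this deck intended), the IMS HD Reorganization Unload utility (DFSURGU0) produces a transportable unload of a full-function database that can be reloaded on RD&T with the matching reload utility (DFSURGL0). The DBD name, libraries, and dataset names below are placeholders, and the database DD name must match the one defined in the DBD.

//IMSUNLD  JOB (ACCT),'HD UNLOAD',CLASS=A,MSGCLASS=X
//* HD Reorganization Unload of a full-function IMS database
//UNLOAD   EXEC PGM=DFSRRC00,
//            PARM='ULU,DFSURGU0,CUSTDBD'
//STEPLIB  DD DSN=IMS.SDFSRESL,DISP=SHR
//IMS      DD DSN=IMS.DBDLIB,DISP=SHR
//CUSTDB   DD DSN=PROD.CUSTDB.OSAM,DISP=SHR
//DFSURGU1 DD DSN=TESTID.CUSTDB.UNLOAD,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(50,50))
//DFSVSAMP DD DSN=IMS.PROCLIB(DFSVSMDB),DISP=SHR
//SYSPRINT DD SYSOUT=*
/*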
UNIT: Managing Test Data
Topics:
§ DB2 Test Table and Data Migration
§ QSAM/VSAM Data Migration
§ IMS DL/I Database Migration
§ Migrating CICS Transaction Metadata
§ Migrating IMS Transaction Metadata
§ Back-up Slides
Migrating CICS Transaction Metadata
See slide notes
UNIT: Managing Test Data
Topics:
§ DB2 Test Table and Data Migration
§ QSAM/VSAM Data Migration
§ IMS DL/I Database Migration
§ Migrating CICS Transaction Metadata
§ Migrating IMS Transaction Metadata
§ Back-up Slides
Migrating IMS Transaction Metadata
See slide notes
Migrating DB2 Objects
§ DB2 "object" DDL:
- Tables:
  - Table definitions
  - Primary/Foreign Key constraints
- Synonyms
- Views
- Indexes
- GRANTs
§ No need for "physical objects" (tablespaces)
§ DB2 table rows
§ Must understand the physical design
UNIT: Managing Test Data
Topics:
§ DB2 Test Table and Data Migration
§ QSAM/VSAM Data Migration
§ IMS DL/I Database Migration
§ Migrating CICS Transaction Metadata
§ Migrating IMS Transaction Metadata
§ Back-up Slides
FTP Details
§ FTP steps:
- Create the extract
- From the target location:
  - CONNECT DEMOMVS.DEMOPKG.IBM.COM
  - USERNAME/PWD
  - BIN
  - LOCSITE RDW
  - GET <NEWFILENAME> (REPLACE
- Or: don't specify BIN; instead specify quote lrecl=600 recfm=vb blksize=0
§ Check out MPUT for doing library datasets:
- http://billlalonde.tripod.com/back/mvsh019.htm
- http://billlalonde.tripod.com/back/mvsh023.htm
- http://billlalonde.tripod.com/back/mvsh007.htm
- http://publib.boulder.ibm.com/infocenter/zos/v1r11/index.jsp?topic=/com.ibm.zos.r11.f54sg00/ispzsg80166.htm (LMDLIST examples for tools)
§ FTP tips: http://www.mainframetips.com/pc-%E2%80%93-mainframe-zos-file-transfer/
§ On FTP'ing VSAM files: http://www.mail-archive.com/ibm-main@bama.ua.edu/msg92374.html
//GFTPG10  JOB (GENPUTS),'BIOCHIM',
//         USER=,
//         GROUP=,
//         PASSWORD=,
//         TIME=1440,
//         NOTIFY=&SYSUID,
//         REGION=8M,
//         CLASS=A,
//         MSGCLASS=X,
//         MSGLEVEL=(1,1)
//         EXEC PGM=FTP,PARM='10.150.0.78 (EXIT'
//SYSPRINT DD SYSOUT=*
//OUTPUT   DD SYSOUT=*,DCB=(RECFM=FB,LRECL=160,BLKSIZE=3200)
//INPUT    DD *
OPERATOR
SENDSITE
LOCSITE SBD=BCA1VTF.FTP.TCPXLBIN
CD /KCI01
CD B6600
CD HDESK
CD ARCH
PUT 'PRINT01.KCI01.B6600.HDESK.HISGLACC.G10.RC1' HISGLACC.G10
CLOSE
QUIT
//*