
Information Management Capacity Check TOOL AND METHODOLOGY
Table of Contents
Overview of IM Capacity Check Tool (page 4)
Background (page 5)
Intellectual Property (page 6)
Objectives (page 7)
Key Characteristics (page 8)
Concept (page 9)
Elements and Criteria (page 10)
Principles for Developing Elements and Criteria (page 11)
Elements of IM Capacity Check Tool (page 12)
Element Descriptions (page 13)
Level Descriptions (page 14)
Overview of Key Elements and Criteria (page 15)
Guiding Principles for Developing Scales (page 18)
Lessons Learned – Interviews and Communications (page 19)
Project Team Structure (page 20)
Overview and Composition (page 21)
Roles and Responsibilities (page 22)
2
Table of Contents (cont’d)
Methodology for an IMCC Assessment (page 23)
Overall Methodology (page 24)
Step 1 – Planning (page 27)
Step 2 – Data Collection (page 29)
Step 3 – Consolidation of Results (page 32)
Step 4 – Validation of Findings (page 36)
Step 5 – Developing an Action Plan (page 37)
Elements and Scale Descriptions (page 41)
Additional Resources (page 42)
3
Overview – IM Capacity Check Tool 4
Background – IM Capacity Check
– Library and Archives Canada developed the IMCC to help federal government organizations assess their current IM capabilities against industry standards and best practices and develop a strategic plan to improve their IM capacity.
– The IMCC was developed in collaboration with stakeholders, other central agencies and subject matter experts in the public and private sectors.
– Proof of concept was achieved in three successful pilot projects: the first was done on a department-wide basis, the second in a government cluster environment and the third at the project level.
– Over thirty (30) federal government departments and agencies have successfully completed an IMCC self-assessment, and the IMCC has had significant national and international attention.
5
Intellectual Property – Usage Restrictions
The IMCC Tool may only be used in accordance with the following:
– The IM Capacity Check Tool has been designed for the use of (Canadian) federal departments and agencies, or other parties working on their behalf. This condition does not preclude third-party organizations from providing chargeable services utilizing this product in support of federal government IM Capacity Check self-assessments. Third parties may utilize the IM Capacity Check for self-assessment, but no third party may use this product for commercial gain outside the intended use for the (Canadian) federal government.
– Use of the IM Capacity Check Tool must acknowledge and identify BearingPoint as the owner of this product. Departments and agencies have the right to adapt the product, and may do a self-assessment on their own or engage the services of consultants to help them carry out an assessment. Any adaptation must still continue to acknowledge and identify BearingPoint as a source of this product.
6
Objectives – IM Capacity Check
– Assess the state of IM practices within each organization against a common standard. Assess current information management practices against recognized best practices and principles and identify the level of “capacity”.
– Bring together all the elements of information management practices. The Capacity Check is intended to integrate the full range of capabilities necessary to implement IM.
– Compare against best practices. The Capacity Check is based on generally accepted best practices, and therefore provides an opportunity for organizations to assess where they stand relative to these best practices.
– Develop plans for improvements to information management practices. Organizations will be able to prioritize improvements in IM capabilities and pursue high-priority opportunity areas.
7
Key Characteristics – IM Capacity Check
Intended as a diagnostic tool for senior management of the organization, the IM Capacity Check focuses on:
– Future direction – What capabilities must be in place in the future to respond to emerging client demands and a changing environment.
– Capacity – Expanding/improving organizational capability.
– Priorities – Recognizes that an organization can only focus on selected improvement areas at any one time, and cannot be “best” at everything.
– Competencies – Helps identify the information management competencies needed to move forward.
– Senior management – Intended as a diagnostic tool for senior management of the organization.
– Self-assessment – Directed self-assessment tool. Information is collected through interviews/workshops, and then validated by managers collectively.
– Support for current change – Builds upon changes already underway to existing information management processes.
8
Capacity Check – Concept of Capabilities
Capabilities include people, skills, processes, technology, policy, management framework and resources. 9
IMCC Elements and Criteria 10
Principles for Element/Criteria Development
– Bring together the key Elements of information management practices. The Elements reflect the integration of capabilities necessary to efficiently and effectively implement information management at the enterprise level.
– The Elements are based on best practices and expert advice. The Elements are drawn from generally accepted best practices and from subject matter experts.
– The Elements are sufficiently robust to apply across multiple GC organizations. The Elements reflect common areas of capacity building for information management. This provides the opportunity for GC organizations to assess their standing relative to a common set of best practices.
– The Elements collectively define a comprehensive baseline. The Elements help to establish a baseline for IM capacity building, priority setting and action planning.
11
Elements of IM Capacity 12
Element Descriptions
Organizational Context – Defines criteria to assess an organization’s capacity to support, sustain and strengthen IM capabilities. Criteria: Culture; Change Management; External Environment.
Organizational Capabilities – Defines the criteria to assess an organization’s capacity to develop the people, process and technology resources required for sound IM. Criteria: IM Community; Expert Advice; IM Tools; Technology Integration; Portfolio Management; Project Management; Relationship Management.
Management of IM – Defines criteria to assess an organization’s capacity to effectively manage activities in support of IM as it relates to the effective delivery of programs and services. Criteria: Program Integration; Leadership; Risk Management; Strategic Planning; Roles and Responsibilities; Performance Management; Principles, Policies and Standards.
Compliance and Quality – Defines the criteria to assess the organization’s capacity to ensure its information holdings are not compromised. Criteria: Information Quality; Security; Privacy; Business Continuity; Compliance.
Records and Information Life Cycle – Defines the criteria to assess the organization’s capacity to support each phase of the records and information life cycle. Criteria: Planning; Collect, Create, Receive and Capture; Organization; Use and Dissemination; Maintenance, Protection and Preservation; Disposition; Evaluation.
User Perspective – Defines the criteria to assess the organization’s capacity to meet the information needs of all users. Criteria: User Awareness; User Satisfaction; User Training and User Support.
13
Level/Scale Descriptions
§ Capacity 1 – Initial: No systematic or formal approach exists for this capacity. Processes and practices are fragmented or non-existent. Where processes and practices exist, they are applied in an ad hoc manner.
§ Capacity 2 – Defined: Processes and practices are defined to varying degrees and are not applied consistently. Basic management controls and disciplines for the capacity are in place.
§ Capacity 3 – Repeatable: Processes and practices are defined, well understood and used consistently across the organization. Processes and practices are also well documented.
§ Capacity 4 – Managed: A well-defined framework exists for this capacity. Processes and practices are measured and managed to ensure delivery of desired results. Processes and practices are embedded in the values of the organization and are coordinated in an integrated manner.
§ Capacity 5 – Optimizing: Focus is on continuous improvement of the capacity. The concepts of innovation, organizational learning and continuous improvement of the capacity are incorporated into the values of the organization and are consistently applied.
14
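Teams that record ratings in a spreadsheet or script sometimes find it handy to encode the scale itself. The sketch below is a minimal, illustrative Python encoding of the five levels described above; it is not part of the IMCC tool, and the type name is an assumption chosen for the example.

```python
from enum import IntEnum

class CapacityLevel(IntEnum):
    """IMCC capacity scale, as summarized on the slide above (illustrative encoding)."""
    INITIAL = 1      # fragmented or ad hoc processes and practices
    DEFINED = 2      # processes defined to varying degrees, basic controls in place
    REPEATABLE = 3   # processes well understood, documented, used consistently
    MANAGED = 4      # well-defined framework; processes measured and managed
    OPTIMIZING = 5   # continuous improvement embedded in organizational values

# Example: a rating of 4 or 5 signals maturity of capability, not "goodness".
print(CapacityLevel.MANAGED, int(CapacityLevel.MANAGED))  # CapacityLevel.MANAGED 4
```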
The IM Capacity Check Tool – Elements and Criteria 15
Key Elements of the IMCC Tool
1. Organizational Context
– Culture: Recognition by the organization that information is a strategic corporate asset requiring stewardship. Degree of support and reinforcing behavior that is consistent with these values.
– Change Management: Mechanisms to facilitate the adoption of change within IM and related initiatives.
– External Environment: The extent to which the organization conducts environmental scans and assesses their possible impacts on IM.
2. Organizational Capabilities
– IM Community: The extent to which IM specialists have the competencies and capacities to meet the challenges of IM on a sustained basis.
– Expert Advice: The extent to which expert advisors are available and utilized for objective commentary and independent advice in support of IM.
– IM Tools: The extent to which IM tools efficiently and effectively support IM.
– Technology Integration: The degree to which IM enabling technologies are integrated across the organization to support the delivery of information, programs and services.
– Portfolio Management: The extent to which mechanisms to plan, track and evaluate the overall IM project portfolio are available to staff.
– Project Management: The extent to which mechanisms to manage projects in the organization exist to ensure the optimal design, development and deployment of IM initiatives.
– Relationship Management: The extent to which mechanisms or processes exist to facilitate partnerships and consultations between organizations (public and/or private) and other stakeholders in support of effective IM.
3. Management of IM
– Leadership: The extent to which senior management is aware of, understands and demonstrates commitment to a clear vision and set of strategic objectives for IM.
– Strategic Planning: Quality of strategic, business and operational plans for IM, and the linkages between plans, costs, benefits, resources and controls.
– Principles, Policies & Standards: Existence and use of a corporate policy and management framework to effectively support IM. Degree to which IM principles, policies and standards exist, are understood and are applied within the organization.
– Roles and Responsibilities: The extent to which roles, responsibilities, performance expectations, ownership and accountabilities are clearly defined, understood and accepted. Appropriateness of the organizational and governance structures to support IM.
– Program Integration: The extent to which the organization’s programs and projects proactively and efficiently integrate IM principles, policies and standards.
– Risk Management: Mechanisms for identifying, measuring and monitoring relevant risks for IM, including options for risk allocation and risk mitigation.
– Performance Management: The extent to which the achievement of financial and operating results is embedded in the performance management framework for IM.
16
Key Elements of the IMCC Tool (cont’d)
4. Compliance & Quality
– Information Quality: The extent to which the organization’s processes ensure information is accurate, consistent, complete and current.
– Security: The extent to which mechanisms are in place to ensure information is protected from unauthorized access, use and destruction.
– Privacy: Mechanisms to ensure that an individual’s right to privacy in the collection and disclosure of information is respected.
– Business Continuity: The existence of contingency plans and mechanisms to ensure timely information recovery, the restoration of essential records and business resumption in the event of information corruption or loss.
– Compliance: The extent to which audit and review processes are in place to ensure awareness of and compliance with applicable IM legislation, policies and standards.
5. Records and Information Life Cycle Management
– Planning: The extent to which information life-cycle requirements are incorporated in the development of policies, programs, services and systems.
– Collection, Creation, Receipt and Capture: The extent to which information collection, sharing and re-use are optimized and decisions are documented.
– Organization: The extent to which information is identified, categorized, catalogued and stored to effectively and efficiently support the business process.
– Use and Dissemination: The extent to which the organization’s information can be located, retrieved and delivered to provide users with timely and convenient access.
– Maintenance, Protection and Preservation: The extent to which the long-term usability and safeguarding of information is ensured.
– Disposition: The extent to which organizational retention and disposal plans are followed to ensure the timely disposition of information, subject to legal and policy obligations.
– Evaluation: The extent to which an organization can assess the overall compliance and performance of its information management program.
6. User Perspective
– User Awareness: The extent to which information users are aware of the organization’s information products and services.
– User Training & User Support: The availability of user training and support programs to facilitate the access and use of information.
– User Satisfaction: Mechanisms to measure, evaluate and learn from user feedback on information products and services.
17
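Because the consolidation and validation steps later in the deck report findings by criterion, it can help to keep the element-to-criteria structure in one place. The following is a hedged sketch (Python is assumed, and the dictionary name is chosen for illustration) that simply transcribes the six elements and their criteria from the slides above; the assertions double-check the counts quoted later in the deck.

```python
# Illustrative mapping of IMCC elements to their criteria, transcribed from the slides above.
IMCC_ELEMENTS = {
    "Organizational Context": ["Culture", "Change Management", "External Environment"],
    "Organizational Capabilities": ["IM Community", "Expert Advice", "IM Tools",
                                    "Technology Integration", "Portfolio Management",
                                    "Project Management", "Relationship Management"],
    "Management of IM": ["Leadership", "Strategic Planning",
                         "Principles, Policies and Standards", "Roles and Responsibilities",
                         "Program Integration", "Risk Management", "Performance Management"],
    "Compliance and Quality": ["Information Quality", "Security", "Privacy",
                               "Business Continuity", "Compliance"],
    "Records and Information Life Cycle": ["Planning", "Collection, Creation, Receipt and Capture",
                                           "Organization", "Use and Dissemination",
                                           "Maintenance, Protection and Preservation",
                                           "Disposition", "Evaluation"],
    "User Perspective": ["User Awareness", "User Training and Support", "User Satisfaction"],
}

# 6 elements and 32 criteria in total, matching the figures quoted later in the deck.
assert len(IMCC_ELEMENTS) == 6
assert sum(len(criteria) for criteria in IMCC_ELEMENTS.values()) == 32
```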
Guiding Principles for the IMCC Scales
§ The capacity scales have entity-wide relevancy.
§ The capacity scales are sufficiently flexible to apply to other entities and agencies.
§ The capacity scales are incremental; that is, each capacity level within a scale builds on the previous capacity level of the scale.
§ There is relative consistency in the description of the capacity levels across all the scales (i.e., a “1” means the same thing on every scale).
§ Each capacity level description is homogeneous and does not represent more than one level of capacity.
18
Lessons Learned – Interviews and Communications
– Prepare an interim report in order to confirm issues, gaps in information, and topics to pursue in greater detail.
– A briefing is required at the outset of the project so that the consultants have a basic knowledge of the organization in terms of structure and lines of business.
– The interviewee mix must include both policy and operational interviewees to get an adequate cross-section.
– Although not an audit, it is necessary to get some examples of the types of information reported and examples of documentation (e.g., IM plans).
– Telephone interviews are not as effective. Conduct in-person interviews to the extent possible.
– Due to the levels of interviewees (i.e., ADM/DG), some areas should be covered off with higher-level questions. Also, keep communications at a high level prior to interviews.
– Team members should be provided with an orientation that highlights the process steps. This will help during the interviews.
– The interview process must be fluid. Probe the applicable areas and summarize the areas of non-involvement.
19
Project Team Structure and Expected Involvement of Organizational Staff 20
Overview and Composition of Project Team
• Overview of the Project Team
– Multi-disciplinary team
• Possible Composition of the Project Team
– Organizational business/program representatives
– Reviews IM documents
– Assists in interviews
– Helps determine IM capacity “as is” and “to be”
– Identifies opportunities and helps set priorities
– Organizational IM representatives (e.g., records, library, security, ATIP, etc.)
– Library and Archives Canada representative(s)
– Internal Audit
– IT representatives
– Participates in “validating” results and reviews final report
– Other members as deemed appropriate
21
Project Team Roles and Responsibilities
The typical Project Team consists of 10 - 15 members. The typical level of commitment and effort required of Project Team members is approximately 4 - 8 days over the 3 - 4 months of the project.
Project Team members will participate in:
• IM Capacity Check training session
• Data collection
• Interviews and workshops
• Consolidation workshop
• Validation workshop, along with additional senior managers brought in as validators
• Reviewing draft and final reports
Role and commitment of the Chair of the Project Team:
• Single point of contact for the client and Project Team members
• Sits on the Steering Committee
• Focal point for communications planning
• Provides QA role
• Responsible for ensuring the team’s logistics
• Garnering executive support
• Client spokesperson for internal and external briefings
22
Methodology for an IMCC Assessment 23
Overall Methodology and Timeline for Assessment
– A joint consultant-organizational team (the Project Team) is trained in implementing the Capacity Check.
– A mix of techniques is used to collect the information needed for the assessment, including workshops, interviews and review of documentation.
– Senior management from the organization being assessed is involved throughout the process.
– Findings are consolidated and an assessment is done by the joint consultant-organizational Project Team.
– Follow-up group sessions are held with a different cross-section of the senior management team from the organization being assessed, and with the Project Team, to validate the findings, the capability ratings and the opportunities for improvement.
– The final step is preparation and approval of a report outlining the current and future state and an action plan for the resulting opportunities and priorities.
24
Overall Methodology and Timeline for Assessment (cont’d)
Total elapsed time: 3 - 4 months, across five steps: (1) Project planning, (2) Data collection, (3) Consolidate findings, (4) Validation, (5) Action plan.
Approximate effort per step: Project planning – Project Team, 0.5 - 1 days; Data collection – Project Team, 1 - 2 days; Consolidate findings – Project Team, 2 - 2.5 days; Validation – Project Team and senior management, 1 - 2 days.
Core project team: experts in IM, program delivery, information technology, organizational context and user context – organizational managers who are knowledgeable of the organization’s IM practices.
25
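Assuming the per-step reconstruction above is read correctly, the step efforts can be summed to check consistency with the 4 - 8 day commitment quoted on the project team slide. The sketch below is illustrative only; the figures are copied from the slide, not new data.

```python
# Illustrative check: per-step Project Team effort in days, as reconstructed above.
effort_days = {
    "Project planning": (0.5, 1.0),
    "Data collection": (1.0, 2.0),
    "Consolidate findings": (2.0, 2.5),
    "Validation": (1.0, 2.0),
}

low = sum(lo for lo, _ in effort_days.values())
high = sum(hi for _, hi in effort_days.values())
# Prints 4.5 - 7.5 days, in line with the 4 - 8 days quoted on the project team slide.
print(f"Total Project Team effort: {low} - {high} days")
```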
IMCC Methodology Note: The following represents an overview of the IMCC Methodology. LAC recommends that the Methodology be followed as shown to obtain maximum input from stakeholders and organization-wide buy-in for planned priorities. 26
Step 1 - Planning
ACTIVITIES
1.1 Project initiation
1.2 Briefing of organization
1.3 Project team training
1.4 Workshop/interview planning
1.5 Documentation review
1.6 Develop communications plan
1.7 Presentation to senior management (if desired)
DELIVERABLES
• Workplan and schedule
• Training manual
• Interviewee list / number of interviews
• Interview/workshop guide and information package
• List of documentation to be reviewed
• Communiqués
Challenges:
• Obtaining management commitment and resource support for the project.
• Establishing an interviewee list that is representative of the organization (e.g., by management level, sector, region, operational versus policy). Between 8 and 10 interviews and 1 to 2 interview workshops are standard.
• Customizing and establishing the appropriate level of detail for the interview guide and information package to be distributed to managers.
27
Step 1 - Sample Assessment Timeline 28
Step 2 - Data Collection
ACTIVITIES
2.1 Develop/adapt interview and workshop guides
2.2 Schedule and conduct interviews and workshops
2.3 On-line survey (if and when required)
2.4 Documentation review
2.5 Summarize findings
DELIVERABLES
• Interview and workshop guides
• Interview and workshop notes
• On-line survey results (when required)
• Results of documentation review
• Interview/workshop notes reported by criterion
Challenges:
• Ensuring all criteria are covered through interviews/workshops.
• Reporting findings in a systematic/structured approach based on Capacity Check criteria.
• Confidentiality of interviews.
• Focusing on actual status rather than the hypothetical (i.e., clearly distinguishing between what the organization should be doing and what the organization is actually doing at this time).
29
Step 2 - Data Collection Approaches
The IM Capacity Check assessment draws on three data collection approaches: interviews with senior managers and other internal and external stakeholders; document and literature review; and workshops and site visits.
30
Step 2 - Data Collection Process
– Interviews tend to be for executive management. The number of interviews depends on the size and breadth of the organization, typically 12 - 15, but this will vary. Interview questions are sent to interviewees in advance. A Project Team member attends with the consultant conducting the interview and helps take notes and consolidate findings.
– Workshops tend to be for senior staff/management. Typical size is 12 - 15 participants. Two workshops should be enough for most organizations.
– Data collection also covers organizational policies and guides, performance reports, organizational structure, governance models, and IM and strategic plans. Any relevant documentation identified during the interviews is also collected and reviewed.
31
Step 3 - Consolidation of Results
ACTIVITIES
3.1 Consolidate findings by criterion
3.2 Project team workshop to summarize findings and opportunities for improvement/issues for each criterion
3.3 Project team establishes a preliminary rating for each criterion, with rationale
3.4 Prepare preliminary report
DELIVERABLES
• Findings/opportunities by criterion
• Conclusions reported by major element
• Ratings per criterion, “as-is” and “to-be”
Challenges:
• Identifying overall patterns/trends, given that sectors within the organization often operate in different business environments.
• Ratings and priorities should be secondary to findings, conclusions and improvement opportunities.
32
Step 3 - Consolidated Results Template (prepared for each criterion)
Findings and Issues: A summary of findings and issues related to the criterion is inserted here.
Opportunities: Identified opportunities related to the criterion are inserted here.
Example criterion – Roles and Responsibilities: The extent to which IM roles and responsibilities are clearly defined, understood and accepted. Appropriateness of the organization and governance structures to support IM.
Capacity 1: IM roles and responsibilities are not well defined. The organization and governance structures are not appropriate for the management of IM initiatives.
Capacity 2: IM roles and responsibilities are generally defined but not well understood. Some overlaps and gaps exist vis-à-vis roles and responsibilities. Minimal governance structures exist in support of IM. The IM governance structure may be fragmented or inappropriately positioned within the organization.
Capacity 3: IM roles and responsibilities are clearly defined and understood, and generally aligned with the organization’s objectives. Little or no overlaps or gaps in IM responsibilities exist. The governance structure is appropriately positioned within the organization. Effective governance structures are in place.
Capacity 4: Changes to IM roles, responsibilities, organization and governance structures are made quickly and proactively following regular consultation with stakeholders.
Capacity 5: An IM champion is responsible for ensuring the integration of IM practices across both administrative and program areas. IM roles, responsibilities, organization and governance structures are continuously reviewed and updated to reflect changing business and technology environments.
33
Step 3 - Assessing the Capabilities: “As-Is” and “To-Be” Assessment (Sample)
– Current capabilities are assessed based on the key elements of the IM Capacity Check and the criteria provided for each key element.
– The capabilities depicted within the criteria represent different states or plateaus that the organization may strive to achieve. The descriptions are incremental.
– The capability descriptions are based on generally recognized best practices, but have been customized to reflect the Government of Canada context.
– The organization identifies which level of “maturity” would be the most appropriate in support of its business needs and priorities, and consistent with its capabilities.
– A rating system of “1” to “5” is used. A rating of “5” does not necessarily mean “goodness”, but rather maturity of capability. The ideal maturity rating for any area depends on the needs of the organization.
[Sample: the Roles and Responsibilities capacity scale from the previous slide, annotated to show the existing maturity (“as-is”) and the future capability the organization may strive to reach (“to-be”).]
34
Step 3 - “As-Is” and “To-Be” Assessments Overview
Legend: as-is and to-be ratings (1 to 5) are plotted for each criterion, grouped by element.
Organizational Context: Culture; Change Management; External Environment
Organizational Capabilities: IM Community; Expert Advice; IM Tools; Technology Integration; Portfolio Management; Project Management; Relationship Management
Management of IM: Leadership; Strategic Planning; Principles, Policies and Standards; Roles and Responsibilities; Program Integration; Risk Management; Performance Management
Compliance and Quality: Information Quality; Security; Privacy; Business Continuity; Compliance
Records and Information Life Cycle: Planning; Collect, Create, Receive and Capture; Organization; Use and Dissemination; Maintenance, Protection and Preservation; Disposition; Evaluation
User Perspective: User Awareness; User Training and Support; User Satisfaction
35
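Once “as-is” and “to-be” ratings exist for each criterion, sorting by the size of the gap is a simple way to see where the biggest lift is expected. The sketch below is illustrative only: the ratings are hypothetical examples and the data layout is an assumption, not an output format prescribed by the IMCC.

```python
# Hypothetical ratings for a handful of criteria on the 1-5 capacity scale; not real assessment data.
ratings = {
    "Roles and Responsibilities": {"as_is": 2, "to_be": 4},
    "Security":                   {"as_is": 3, "to_be": 4},
    "User Awareness":             {"as_is": 1, "to_be": 3},
}

# Sort criteria by the size of the capability gap, largest first.
gaps = sorted(ratings.items(), key=lambda kv: kv[1]["to_be"] - kv[1]["as_is"], reverse=True)
for criterion, r in gaps:
    print(f"{criterion}: as-is {r['as_is']}, to-be {r['to_be']}, gap {r['to_be'] - r['as_is']}")
```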
Step 4 - Validation of Findings
ACTIVITIES
4.1 Conduct workshops with senior managers (validators) and the Project Team to validate findings, conclusions and ratings
4.2 Discuss relative importance of criteria and opportunities for improvement
4.3 Update report
DELIVERABLES
• Validation of the findings, conclusions and ratings
• Opportunities for improvement
• Establish 5 - 10 priorities
Challenges:
• Adopting a forward-looking strategic approach.
• Being honest about major improvement opportunity areas.
• Reaching consensus on priority areas of improvement.
• The validators may dictate the approach for the deliverables of this session. This methodology is sufficiently flexible to allow a customized approach to capture the deliverables.
36
Step 5 - Developing an Action Plan
ACTIVITIES
5.1 Present results to senior management
5.2 Senior management assesses where the organization should be in terms of “target” capability ratings
5.3 Senior management prioritizes the criteria and opportunities
5.4 Develop action plan
DELIVERABLES
• “Target” capability ratings
• Relative priority of criteria
• Ranking of opportunities
• Implementation strategy and action plan
Challenges:
• Senior management needs to take ownership of the results.
• Developing an action-oriented plan. The organization needs to focus on the high-priority areas.
• Summarizing and reporting the findings in a manner that accommodates the needs of different levels of management. The results will need to be communicated on an organization-wide basis.
• Maintaining linkages between the various initiatives currently ongoing in the organization in support of IM.
37
Step 5 - Summary of Priorities and Opportunities - Template
To facilitate the prioritization of the projects, we graph them in the chart below based on two factors: the level of effort to implement and the expected impact the initiative will have on the organization. Those of low effort and high impact may be likely candidates to begin with, to gain some initial successes.
[Chart: opportunities plotted by Effort (low/medium/high) against Impact (low/medium/high); quadrant labels include Low Hanging Fruit, Major Change, Administrative and Question Mark. Opportunities 1 to 5 are shown as examples.]
38
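The effort/impact chart can also be reproduced as a simple classification rule. The sketch below uses hypothetical opportunities and one plausible assignment of the four quadrant labels; the deck itself only states that low-effort, high-impact items are likely starting candidates, so the other thresholds and placements are assumptions.

```python
# Hypothetical opportunities scored on effort and impact ("low", "medium", "high"); not from the deck.
opportunities = {
    "Opportunity 1": {"effort": "low", "impact": "high"},
    "Opportunity 2": {"effort": "medium", "impact": "medium"},
    "Opportunity 3": {"effort": "high", "impact": "high"},
    "Opportunity 4": {"effort": "high", "impact": "low"},
    "Opportunity 5": {"effort": "low", "impact": "low"},
}

def quadrant(effort: str, impact: str) -> str:
    """One plausible reading of the chart's quadrant labels (assumed, not prescribed by the deck)."""
    if effort == "low" and impact == "high":
        return "Low Hanging Fruit"   # likely candidates to begin with
    if effort == "high" and impact == "high":
        return "Major Change"
    if effort == "low" and impact == "low":
        return "Administrative"
    return "Question Mark"           # everything else, notably high effort / low impact

for name, score in opportunities.items():
    print(name, "->", quadrant(score["effort"], score["impact"]))
```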
Step 5 - Transition Map - Template
This graph outlines the various opportunities and their intended occurrence over time.
[Chart: opportunities plotted against a Year 1 to Year 3 timeline, grouped into short-term, medium-term and long-term horizons.]
39
Step 5 - Contents of Assessment Report
• Executive Summary
– Key Themes
– Summary of Findings
– Highlights of Findings
– Projects
– Action Plan
• Background
• Overview
– Objectives of the Capacity Check
– Key Characteristics
– Key IM Elements Examined
– The Mechanics of the Capacity Check
• Project Objectives, Scope and Process Overview
• Summary of IM Capacity Check Assessment Findings/Opportunities (by criterion)
• Lessons Learned
• Appendix A - Background Information
– Interviews
– Workshops
– Documents Reviewed
40
Elements and Scale Descriptions
Note: Due to proprietary considerations, full descriptions of the original 6 IMCC Elements and 32 Criteria and the rating scales are only available from Library and Archives Canada. Contact us via e-mail at IMGI@lac-bac.gc.ca.
41
Additional Resources
After an IM Capacity Check is completed, Library and Archives Canada has a number of guides, tools and best practices to help improve IM capacity in your areas of priority and need. Additional material useful for conducting an IM Capacity Check assessment, such as interview guides and communiqués, is also available.
For information please contact:
Information Management Centre
Email: imgi@archives.ca
Telephone: (819) 934-7519
42