Policy Capability Framework: review tool

Issue date: Friday, 27 November 2020
Issue status: Current

What is the Policy Capability Framework? 

The Policy Capability Framework (PCF) is a performance improvement tool. It aims to help agencies review and improve the overall policy capability of their organisations. It describes the key components of policy capability, and offers related lines of inquiry and potential indicators for reviewing that capability.

The tool is not a ‘how to’ guide. Rather, it's designed to prompt reflection and courageous conversations about current policy performance, and to support agencies to develop strategies and priorities to improve that performance. It covers four main dimensions of capability that policy leaders identified as critical in a high-performing policy shop:

  • People capability 
  • Policy quality systems 
  • Engagement and customer-centricity 
  • Stewardship

The tool draws on the Performance Improvement Framework (PIF) model. Where the PIF deals with overall organisational capability and performance, the PCF facilitates a deep-dive review of policy capability. The PCF sits in the context of overall organisational capability and substantive government policy settings and priorities. 

Why a Policy Capability Framework?  

Policy leaders are all interested in building the capability of their policy functions. Often improvement efforts are driven by enthusiastic individuals without a model or tools to guide the process. The upshot is a tendency to ‘reinvent the wheel’, which in turn denies the wider policy community the opportunity to learn from improvement processes in other agencies. As a system, we have no collective view on what a high-performing policy shop looks like and how to get there.

By capturing the experience of policy leaders, the PCF supports agencies to build on the experience of others, and helps improve overall policy capability and the quality of policy advice across government – the ultimate aim of the Policy Project.

The PCF was co-produced by policy leaders and launched by the Head of the Policy Profession and the Prime Minister in August 2016. 

How to use the Policy Capability Framework  

Getting started  

  • A PCF review is about seeking capability improvement. It's not a retrospective audit. The aim is to understand current capability, and to set an improvement trajectory towards a desired future state. 
  • A clear mandate from the leadership team (agency or policy team) for the self-review is critical.  
  • Staff need to feel involved and part of the review process – ‘doing with’ rather than ‘doing to’ – and should have time allocated to participate in the diagnosis and design of improvement solutions.
  • The policy team needs to have an open mind and be prepared to challenge itself.  

 

How the Policy Capability Framework can be used for self-review

Approach and process

There are different ways the PCF can be used. Things to consider for a self-review are:

  • Review team – the team may be the leadership of the policy function (e.g. using the tool to prompt a conversation at a senior leaders/managers ‘away day’) or a purpose-built team (e.g. three to five people from across the function) that then engages with staff and the leadership of the policy function to moderate its findings.
  • Background information – relevant existing information can inform the review (e.g. recent PIF, responses to the Treasury policy measurement exercise, engagement/policy quality/ministerial satisfaction scores, and workforce data).
  • Applying the PCF tool – begin by considering the lead questions for each element of the four dimensions. If you have difficulty answering any of those lead questions, consider the more detailed lines of inquiry and indicators, which will help you answer them. The self-review will probably get bogged down in detail if it attempts to answer every individual line of inquiry, or to develop action plans for all 19 elements of the PCF. Applying the model takes judgement, based on evidence about the current state, as well as insights and knowledge about identified trouble spots or gaps. The PCF is intended to stimulate a discussion about the capability of your agency’s policy function, and to identify your areas of strength and priority actions in a small number of PCF elements to improve overall policy capability.
  • Ratings – maturity levels can be used to rate current capability against the PCF lead questions. They are optional, but can help guide decisions on the priority areas for capability improvement and on the levels of improvement sought.
  • Timing – the action plan should include a defined schedule and timeline (who will do what by when). This is important to ensure that momentum to improve capability is maintained and findings remain current. Short and sharp is better than a drawn-out process, and taking action sooner rather than later is preferable. Ideally, a PCF review should be undertaken every two years, to assess progress and reset priorities and the schedule of actions as appropriate.
  • Reporting – the report on the PCF self-review should summarise the key findings (e.g. referencing the lead questions) and conclude by setting out the four to five key things that the team and leadership agree they should focus on to improve capability.

Follow up and follow through

After the PCF self-review is completed, the leadership of the policy function should communicate the findings and follow up, acknowledging the efforts of the self-review team and staff. In particular:

  • Agreed priority areas for capability improvement, the plan to progress them, and who is responsible for taking things forward should be clear.
  • Regular updates on progress, with opportunities to acknowledge and celebrate success, as well as open channels for feedback on and iteration of improvement strategies (‘learn as you go’), are preferable. This will help to maintain momentum and mitigate the risk of returning to business as usual.

Inviting others in – from self to peer to external review

The PCF is intended in the first instance as an internal self-review tool for policy teams. An external ‘fresh set of eyes’ perspective can add an extra layer of insight. For example:

  • Non-policy input – including someone from another function in the agency (e.g. from operations, finance, HR) can surface insights about how the policy function is perceived by others and help explore the interface between functions in the agency.
  • Critical friend peer review – including someone from an external agency in the self-review team (perhaps someone who has used the PCF in their policy team) can enable cross-fertilisation of ideas and neutral challenge.
  • Independent/external review – the PCF can also be used as the basis of a more detailed assessment, including by independent external reviewers.

Sharing the lessons – building capability across the system

The Policy Project team is available to support agencies to undertake a self-review.

We encourage review teams to document their journey and share their lessons learnt. We are interested in both the process of designing an improvement trajectory and in how useful the PCF was. Knowing this will help us to improve the PCF for future users.

Get in touch at policy.project@dpmc.govt.nz

Applying this review tool

The following section presents the four dimensions of the Policy Capability Framework. Under each of these dimensions is a series of elements, which are each accompanied by the lines of inquiry and potential indicators. After reviewing the lines of inquiry and indicators under each element, record your answers to the following three questions:

  1. Where are we now?
  2. Where do we want to be and by when?
  3. What will we do to get there?

Collectively, your answers to questions 1 and 2 provide an in-depth assessment of the current state of the policy capability of your agency or your policy team. Your answers to question 3 can form the basis of your action planning to improve future performance. 

Elements of the Policy Capability Framework

Policy quality systems

Build the systems and processes that support the delivery of quality policy advice

The Policy quality systems dimension contains the following elements:

  • Commissioning
  • Planning and project management
  • Research, analysis and knowledge
  • Quality assurance
  • Evaluation and learning.

Commissioning

Lead question:

  • How well does the team use appropriate systems and processes to ensure that the supply of policy advice meets demand and has impact?

Lines of inquiry / Indicators:

  • Is the policy intent/commissioned product clear from inception? Is there ‘free and frank’ challenge where necessary (where an alternative approach/process might have more chance of delivering policy intent)? Are appropriate commissioning tools, templates and guidance made available and consistently used by policy staff? To what extent are policy staff able to be present at meetings with senior officials/ministers when work is commissioned? What strategies are in place to avoid policy intent being ‘lost in translation’ (including through relationships with ministerial office staff)?
  • How is proactive, unsolicited policy advice offered and received (e.g. proposing changes to policy settings or transformative policy shifts)?

Planning and project management

Lead question:

  • How well does the team ensure that the right policy outputs are delivered, on time, using the most efficient mix of resources?

Lines of inquiry / Indicators:

  • How are resources prioritised to the highest value work, and low value work deprioritised/stopped?
  • How are policy outputs costed, and how is this information used for planning, prioritisation and resource allocation? Are outputs typically delivered on time and budget?
  • Are ‘fit for purpose’ project management methods and tools effectively employed by policy staff? What templates and guidance are available to support the choice of method? Are project management skills present in the policy team?

Research, analysis and knowledge

Lead question:

  • How well is the policy team actively investing in building its knowledge base over time?

Lines of inquiry / Indicators:

  • How well does the policy team understand, keep up to date with and contribute to the body of knowledge in its field, including relevant literature, and evidence? Are key information gaps identified and is there a plan in place to address them? What systems are in place for recording and accessing relevant previous approaches to policy issues, current evidence (local and international) and anticipating future trends? Are policy staff clear about the set of analytical tools they are required to have proficiency in? Is there good data architecture? Is knowledge (not just data) being generated?

Quality assurance

Lead question:

  • How effective are policy quality assurance processes?

Lines of inquiry / Indicators:

  • What quality assurance and/or peer review processes are in place? Are all policy outputs reviewed for accuracy, formatting and clarity of message? Do the authors of papers receive regular feedback?
  • Are quality ratings from internal and external checks good? Is the robust methodology of the Policy Quality Framework consistently applied when assessing policy advice deliverables?
  • Is ministerial and stakeholder feedback solicited? Is feedback positive/on an upward trajectory?

Evaluation and learning

Lead question:

  • How well is evaluation and learning embedded into business as usual?

Lines of inquiry / Indicators:

  • Is the impact of policies within the agency's or team’s area of responsibility subject to systematic monitoring and evaluation? How are results documented? What investment is there in benefits monitoring, learning and evaluation? Does this inform future policy development? How well are the insights, information and knowledge produced through policy processes systematically captured, shared and used to inform future improvement strategies?

People capability

Ensure the right skills are in the right place at the right time

The people capability dimension contains the following elements:

  • Team make-up and diversity
  • Career paths and progression
  • Development and training
  • Decision rights and enablers
  • Work allocation.

Team make-up and diversity

Lead question:

  • How well does the policy team ensure it has the skills and diversity to achieve its purpose, including the right mix of new talent and experience?

Lines of inquiry / Indicators:

  • Is there an explicit strategy for the make-up and diversity of the team (using the Policy Skills Framework)? Does it ensure the team is fit for purpose/able to deliver on strategy and priorities over time?
  • Is there a good balance between specialists (subject matter experts providing depth) and generalists (providing breadth, including management skills)? Does the team include transformational, not just transactional, policy expertise? How is institutional knowledge maintained and built?

Career paths and progression

Lead question:

  • How effectively are career pathways, rewards and progression opportunities managed?

Lines of inquiry / Indicators:

  • Is there an explicit career progression strategy? How are high performing staff rewarded and retained? How are high potentials developed – to ‘grow or go’? How well are junior staff developed to progress to more senior roles? How effective is succession planning – are (some) senior roles filled internally?
  • How are opportunities to participate and share capability across government encouraged (including through secondments, cross-agency teams)?

Development and training

Lead question:

  • How well do managers know what skills the team needs and how they are going to develop and maintain them?

Lines of inquiry / Indicators:

  • Is there an explicit staff development strategy – the ‘what’ (e.g. broad versus deep capability), and the ‘how’ (e.g. 70/20/10 model)?
  • Do all policy staff understand the ‘policy basics’ (e.g. legislative and Cabinet processes, agency policy processes, analytical tools and methods, choice of policy instruments – see the Policy Quality Framework)? How well are staff provided with performance feedback that enables them to set a trajectory for developing their policy skills?
  • To what extent is staff induction, development, and training prioritised and resourced?
  • How are staff encouraged and enabled to have good external connections (including with other agencies, stakeholders, academia and international counterparts) and to keep up with the latest thinking?

Decision rights and enablers

Lead question:

  • How well are staff provided with autonomy commensurate with their experience, and provided with adequate assistance when making decisions that stretch them?

Lines of inquiry / Indicators:

  • Is responsibility for policy advice outputs/activities devolved to the lowest possible level?
  • How are staff provided with advice, frameworks and tools to help them assume responsibility for decisions up to the level of their competence and the agency’s risk management/tolerance?

Work allocation

Lead question:

  • How well does the distribution of work support staff development and resilience?

Lines of inquiry / Indicators:

  • How well is work distributed amongst staff? Are there staff who regularly have spare capacity or who are regularly overloaded?
  • Is there an overreliance on experienced ‘policy stars’ to keep the policy machine running (key person risk)? To what extent are core staff (versus contractors) doing the key work? How does the distribution of work (in the team, buying in expertise) support building in-house capability?

Engagement and customer-centricity

Understand and meet the expectations of ministers, customers and other stakeholders

The engagement and customer-centricity dimension contains the following elements:

  • Ministers and Cabinet
  • Customers and other end users
  • Other agencies
  • Stakeholders
  • Frontline staff/delivery units.

Ministers and Cabinet

Lead question:

  • How well does the policy team provide advice and services to ministers and Cabinet?

Lines of inquiry / Indicators:

  • Do ministers (including non-responsible ministers) show confidence in the team and its advice? Is the policy team sought out to solve ministers’ problems rather than implement ministers’ solutions? Does the team consistently provide ‘free and frank’ advice?
  • Does the policy team understand the needs of multiple ministers and give joined-up outcome focused advice? Do leaders perform with confidence when fronting proposals in ministerial meetings/Cabinet committees?

Customers and other end users

Lead question:

  • How well does the policy team understand the agency’s customers and their needs?

Lines of inquiry / Indicators:

  • How well does the team explore ways to deliver value to citizens as customers? What methods are employed to generate insights about, solicit the views of, understand and respond to the various needs of those who will be affected by policy options? To what extent do insights about user needs influence policy options? How does the team ensure it considers customers’ short and longer term needs?

Other agencies

Lead question:

  • How well does the policy team work with other agencies to facilitate alignment and coordination across government?

Lines of inquiry / Indicators:

  • How does the policy team build and maintain effective relationships with key stakeholder agencies? How does the agency determine what needs to be managed across agencies/the system and when to do that? What contribution does the team make to policy alignment across government (e.g. ensuring minimal incidence of split recommendations in Cabinet papers)?
  • Do other agencies actively seek the input of the policy team or invite the team to participate in their policy processes?

Stakeholders

Lead question:

  • How well does the policy team collaborate with stakeholders?

Lines of inquiry / Indicators:

  • Does the team take a deliberate and systematic approach to engaging with key stakeholders (e.g. Māori) to build ‘relationship capital’? Are key stakeholders engaged in the policy process (including early in problem definition, not just consulted on solutions)? To what extent is there common ownership of key outcomes (and some co-production of solutions) with stakeholders?
  • Are relationships with stakeholders considered (mutually) effective? Do stakeholders feel heard, even when there is disagreement?

Frontline staff/delivery units

Lead question:

  • How well does the policy team engage across the agency, including with delivery units?

Lines of inquiry / Indicators:

  • Are policy processes characterised by end-to-end partnerships between the policy team and other agency staff?
  • How well does the policy team engage with delivery/frontline units (including delivery staff in other agencies where applicable) to understand the interface with end users and implementation requirements? Does implementation typically proceed smoothly, with room for iteration, without being negatively impacted by unforeseen issues?

Stewardship

Focus on policy outcomes and build capability for the future

The Stewardship dimension contains the following elements:

  • Leadership and direction
  • Strategy and priorities
  • Culture
  • Investment in future capability.

Leadership and direction

Lead question:

  • How well do leaders articulate a clear vision of policy directions and a roadmap for achieving policy outcomes that benefit New Zealanders?

Lines of inquiry / Indicators:

  • How does the agency shape and influence the broader policy agenda and engage others in that vision (agency and wider system including government and sector goals)?
  • To what extent do policy leaders demonstrate the importance of visioning, exploration and debate about emerging strategic issues?

Strategy and priorities

Lead question:

  • How well does the policy team know what it is trying to achieve and how it contributes to agency, sector and system policy objectives?

Lines of inquiry / Indicators:

  • How do leaders ensure a steadfast focus on better public value?
  • How well does the team understand its environment and foresee upcoming trends, issues and demands? Can staff articulate what they are trying to achieve? Is the policy team strategic, and able to deliver proactive and long-term policy advice as well as being responsive to immediate ministerial priorities? Are trade-offs deliberate and based on clarity about what matters most? Are resources safeguarded for longer-term work, and less important work deprioritised?
  • How well is the work agenda driven by current and anticipated future demands (not by what current capability can supply)?

Culture

Lead question:

  • How well is a culture of achieving outcomes, constructive challenge, innovation and continuous improvement promoted and maintained?

Lines of inquiry / Indicators:

  • How are credible and robust discussions within and between policy teams encouraged? How are opportunities presented to consider different approaches to policy challenges (i.e. to invite innovation)?
  • To what extent do staff demonstrate that they are motivated, engaged and invested in the mission of the policy team and agency? How well do leaders drive and enable high performance?
  • Does the reputation of the policy team mean it is sought after for opinions and input? Do people want to work here?

Investment in future capability

Lead question:

  • How well does the team plan and resource to build future policy capability (both policy content and people)?

Lines of inquiry / Indicators:

  • Are leaders committed to organisational learning and growing policy capability?
  • Is there a clear plan for investing in capability that might be needed in the future, including through knowledge management and a research strategy? How well are knowledge gaps identified (e.g. scanning) and a clear plan for addressing them developed? Are policy staff consistently striving to improve their capabilities (e.g. training, stretch goals)?
Last updated: Friday, 29 October 2021
