Report No: AUS0000222
Europe and Central Asia
Development of EU Governance Indicators
Indicators of Citizen-Centric Public Service Delivery – Final Report
March 16, 2018
GOV
Document of the World Bank

Indicators of CITIZEN-CENTRIC Public Service Delivery

© 2018 International Bank for Reconstruction and Development/The World Bank
1818 H Street NW, Washington, DC 20433
202-473-1000 | www.worldbank.org

Some rights reserved.

This work is a product of the staff of The World Bank. The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of The World Bank, its Board of Executive Directors, or the governments they represent. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries. Nothing herein shall constitute or be considered to be a limitation upon or waiver of the privileges and immunities of the World Bank, all of which are specifically reserved.

Rights and Permissions

The material in this work is subject to copyright. Because The World Bank encourages dissemination of its knowledge, this work may be reproduced, in whole or in part, for noncommercial purposes as long as full attribution to this work is given. Attribution—Please cite the work as follows: "World Bank. {YEAR OF PUBLICATION}. {TITLE}. © World Bank."

Any queries on rights and licenses, including subsidiary rights, should be addressed to World Bank Publications, World Bank Group, 1818 H Street NW, Washington, DC 20433, USA; fax: 202-522-2625; email: pubrights@worldbank.org.

Contents

Acknowledgments ..... v
Introduction ..... vi
I. Planning a Citizen-Centric Service Delivery Assessment: How and Why? ..... 1
   What is Citizen-Centric Service Delivery, and What are Its Benefits? ..... 1
   The Evolution of the Theory and Practice of Public Service Delivery ..... 3
   How Can Citizen-Centricity Be Measured? ..... 5
II. The Citizen Survey and Public Administrator Checklist ..... 8
   Objectives and Structure of the Instruments ..... 8
   What Do the Components Capture? ..... 8
   Question Types ..... 11
   Answer Types ..... 13
III. Customizing the Citizen-Centric Service Delivery Assessment ..... 14
   Objectives ..... 14
   Number and Sequencing of Questions ..... 14
   Answer Scale ..... 15
   Target Group and Survey Method ..... 15
   Frequency of Enumeration ..... 16
IV. The Citizen Survey ..... 18
V. The Administrator Checklist ..... 25

Appendixes
A. Administrator Checklist as Filled Out by a Municipal Registry Office: Illustrative Example ..... 37
B. Citizen Survey as Customized by a Municipal Registry Office: Illustrative Example ..... 49
C. Advantages and Disadvantages of Various Surveying Methods ..... 52

References and Bibliography ..... 55

Boxes
1.1 Citizen-Centric Service Delivery Indicators: What They Are Not ..... 1
1.2 Client, Beneficiary, User, and Citizen: A Clarification of Terminology ..... 2
1.3 "Life is about Events, Not Agencies": Building Government Services Around the Needs of New and Expectant Parents in New Zealand ..... 5
1.4 Measuring Citizen Satisfaction: Caveats ..... 6

Tables
2.1 Summary of Issues Explored in Citizen Survey and Administrator Checklist ..... 10
2.2 Advantages and Disadvantages of Open- and Closed-Ended Questions ..... 13
Acknowledgments

This report was prepared in response to a request by the Economic Analysis Unit of the European Commission's Directorate-General for Regional and Urban Policy (DG REGIO) to the World Bank.

The report was prepared by Hélène Pfeil with valuable contributions from Sanjay Agarwal, David Bernstein, Francesca Recanatini, Steve Knack, and Peter Ladegaard. Sanjay Agarwal coordinated the overall effort, while David Bernstein, task team leader, provided strategic guidance.

The team would like to thank the following peer reviewers for their comments and helpful guidance on improving the final report: Jairo Alcuna-Alfaro (United Nations Development Program), Dan Batista (Institute for Citizen-Centered Service, Canada), Etienne Charbonneau (Ecole nationale d'administration publique, Canada), Maksym Ivanya (Joint Vienna Institute, Austria), Orla McBreen (Department of Public Expenditure and Reform, Ireland), Alina Mungiu-Pippidi (Hertie School of Governance, Germany), Stephen Nix (International Republican Institute, USA), Mitchell Seligson (Vanderbilt University, USA), Anwar Shah (Center for Public Economics, China), Gabriel Sipos (Transparency International, Slovakia), Laura Sommer (Department of Internal Affairs, New Zealand), Santosh Srinivasan (Transparency International Secretariat, Germany), Ruslan Stefanov (Center for the Study of Democracy, Bulgaria), Alexander Stoyanov (Center for the Study of Democracy, Bulgaria), Johannes Tonn (Global Integrity, USA), Stephanie Trapnell (George Mason University, USA), and Steven Van de Walle (KU Leuven Public Governance Institute, Belgium).

Finally, the team would like to thank Lewis Dijkstra, Head, Economic Analysis Sector, DG REGIO, for providing the trust funds to support the Actionable Governance Indicator Project and for his personal guidance and leadership in the development of this report.

Introduction

This report and the instruments it proposes are primarily aimed at public administrators who would like to ensure that their service delivery mechanisms respond to the needs and expectations of citizens. The goal is to encourage reflection on how to best design and enhance public service delivery processes so that public institutions serve their constituents well, increase transparency and accountability, and strengthen the trust of citizens in the state, thereby reinforcing the social compact. This work may also be of interest to academics and practitioners working on issues related to citizen-centricity.

The report was developed under the European Union Actionable Regional Governance Indicators for Public Administrative Performance and Capacity Initiative, which is funded by the European Commission and implemented by the World Bank Group. The overall initiative, requested by the Economic Analysis Unit of the European Commission's Directorate-General for Regional and Urban Policy (DG REGIO), is structured as three sets of activities that aim to identify and develop actionable indicators of the quality and capacity of public administrations in European Union Member States at the national and regional levels.

This document summarizes research undertaken under the set of activities dealing with citizen-centric governance indicators, that is, indicators that measure the capacity of public agencies to put the needs of citizens at the center of their service delivery mechanisms. The other two activity sets focus on public sector governance indicators and regulatory governance indicators, respectively.

The report is structured into five parts. Part I presents the conceptual framework that forms the backdrop for developing citizen-centric service delivery indicators and summarizes what citizen-centric service delivery entails. Part II introduces two complementary tools designed to assess the performance of public institutions and the quality of public services from the perspective of European Union citizens: a demand-side citizen survey and a supply-side self-assessment checklist for public administrators. They are meant to help public agencies identify gaps and areas for improvement in their service delivery mechanisms by gathering direct feedback regarding the experiences and perceptions of their users and by critically examining public sector efforts to fulfill the needs and expectations of citizens. The instruments complement one another in facilitating the institutional strengthening of public service delivery, but they are not prescriptive or set in stone. Instead, they are intended as flexible, inspirational tools that provide an initial grid for administrations willing to move one step closer to their citizens. Part III describes options for customizing the instruments, which can be adapted to a variety of circumstances and service delivery types. Parts IV and V present the citizen survey and administrator checklist. For illustrative purposes, appendixes A and B depict versions of the tools tailored to the delivery of administrative documents by a municipal registry office. Appendix C explores the advantages and disadvantages of a range of surveying methods.

I. Planning a Citizen-Centric Service Delivery Assessment: How and Why?

What is Citizen-Centric Service Delivery, and What are Its Benefits?

High-quality service delivery requires a sound understanding of citizens' expectations, experiences, and key drivers of satisfaction, as well as a policy framework that places citizens at the center of decision-making processes rather than at the periphery. Citizen-centric service delivery indicators focus on the collection of data that can help governments become better at what they do—deliver services to citizens in a responsive and equitable manner. No political elite can build a sustainable and just environment, where institutions foster inclusive economic growth and higher standards of living for all segments of society, without a constructive, two-way relationship with citizens. The organizing principle of public service delivery must be the needs of users. This is notably reflected in Sustainable Development Goal 16.6, which aims to "develop effective, accountable and transparent institutions at all levels," as well as indicator 16.6.2, which proposes to measure the "proportion of the population satisfied with their last experience of public services."

Citizen-focused service delivery indicators measure the extent to which the needs and voices of citizens are considered during the various stages of public service design, delivery, and evaluation/review. In a citizen-centric service delivery system, the main imperative is not to fit operational structures and processes to the requirements of government departments, but to serve citizens—who are considered the main stakeholders.
Box 1.1. Citizen-Centric Service Delivery Indicators: What They Are Not

Citizen-centric indicators are an essential component of assessing the quality of governance and public service delivery, but they only provide information about a segment of the elements comprising effective and fair governance systems. They should therefore be considered a complement to other indicators, such as datasets related to public financial management, public investment management, tax administration, procurement, human resource management, innovation and competitiveness, justice and rule of law, public information systems, and regulatory governance, among others.a Given their focus on public sector integrity, some close synergies exist between citizen-centric indicators and indicators related to anticorruption, transparency, and accountability.

a. For a more in-depth consideration of regulatory and public sector governance indicators, refer to the other reports produced under the European Union Actionable Regional Governance Indicators for Public Administrative Performance and Capacity initiative: Actionable Regulatory Governance Indicators for EU Regions and Public Sector Governance Indicators for EU Regions.

Box 1.2. Client, Beneficiary, User, and Citizen: A Clarification of Terminology

While the terms consumer or customer are often used to describe recipients of private sector goods or services, the terminology used to describe people receiving services provided by the public sector is debated. Commonly used terms include client, which emphasizes the fact that the services are being provided by a professional entity; user, which reflects the process of using a given service as well as possibly conveying the concept of a continuous enjoyment of a right; and beneficiary, although this term implies that someone derives an advantage from something and could thus be construed as a passive recipient with a relational weakness to the public sector as benefactor.

The term preferred in this report is citizen, which refers in a broad sense "to all people in a society or country in an inclusive and nondiscriminatory way." The term is used similarly in the Strategic Framework for Mainstreaming Citizen Engagement in World Bank Group Operations (World Bank 2014: 7), which understands citizens to be the ultimate clients of government, an approach slightly wider than the strict legal definition, according to which they are the legally recognized subjects or nationals of a state. For this report, citizen best reflects the notion of a social contract between those who govern and those who are governed. The term is meant to encompass a broad variety of people that may be impacted by the delivery of public services, potentially including foreign nationals, refugees, undocumented migrants, and others.

Emerging literature suggests that working toward a more citizen-centric system allows public administrations to increase their efficiency, thanks notably to early or immediate feedback mechanisms for taxpayer-funded services (World Bank 2015), and "flatter, agile, streamlined and tech-enabled" practices (World Economic Forum 2012).
In the reverse case, governments run the risk of seeing resources being diverted or misallocated, thereby diminishing the quality of public service delivery and undermining trust in public institutions.

Putting citizens at the heart of public institutions provides a twofold benefit: it makes public administrations more efficient and increases citizens' satisfaction and trust in government. Citizen-centric service delivery implies that policy makers better understand the needs of and key drivers of satisfaction for citizens, and that they are in a position to "identify sub-groups of users and needs or gaps in accessibility" (OECD 2013). This, in turn, can enable public sector entities to adopt better policies and to provide more responsive services based on citizens' perspectives and empirical evidence. At the same time, tensions may arise when trying to shape service delivery processes in an inclusive way, in a manner primarily directed at problem-solving, and through techniques that foster greater agility and adaptability: politics can influence the willingness of governments to use citizen-focused techniques. Agency staff and administrators must be cognizant of the role that power dynamics can play at any given time. Despite that, proactive learning and the collection of user feedback can support entities in successfully dealing with the challenges of complexity and evolving behaviors. Measuring citizen satisfaction and preferences on a regular basis can help public managers monitor public sector performance over time, continuously improve service delivery, and measure the impact of reforms and service-improvement activities on end users, ultimately allowing for a more citizen-centric allocation of time and resources (HM Government 2007) that can result in a higher likelihood of citizens being satisfied with policy outcomes.

The Evolution of the Theory and Practice of Public Service Delivery

The theory and practice of public administration in democracies have evolved significantly in recent decades. In the late 1970s and early 1980s, the emphasis shifted from a traditional public administration perspective focused on the inner workings of the public sector that envisioned politicians primarily as administrators toward the new public management perspective, which introduced an entrepreneurial point of view centered on performance and accountability for outputs. This newer perspective conceptualizes politicians as managers, and citizens as customers.
It has, in turn, been updated by the concept of network governance and an approach called new public service or new public governance, which emphasizes the links that exist between organizations within and outside the public sector and seeks to build sustained cooperation and purpose-driven coalitions across an enlarged number of governance actors, including politicians and civil servants as well as individual citizens, nongovernmental organizations, and private service providers (OECD 2009; Holmes 2011). Under this approach, citizens are no longer merely thought of as customers or government targets. Rather, they are considered to be agents in their own right, entitled to participate directly or indirectly in decisions affecting them, for example, by co-creating policies and co-producing service design and service delivery.

New institutional economics provides an alternative view of citizen-centered governance. The perspective explicitly recognizes citizens as governors or principals—rather than clients per the new public management perspective—and governments as undertaking collective action to advance the public interest while minimizing transaction costs for the citizens who extend this mandate to the government. This perspective highlights the importance of final outcomes in the governance environment and for service delivery. Public administration models have therefore evolved from static and bureaucratic (traditional public administration) to competitive and minimalistic (new public management), and finally to plural and pluralist (new public governance) (UNDP 2015). This shift represents:

   "a change from models where the government owns inputs and processes, toward a model where the government and citizens jointly own the outcomes. In other words, the government moves from governing for citizens to governing with citizens. This also implies a shift in terms of the citizen moving closer to the center of governance and an evolving public sector where citizens, politicians, bureaucrats and service providers become co-creators of public goods" (UNDP 2016a: 17).

Increasingly, policy makers are placing citizens at the center of their considerations, with the aim "to develop policies and design services that respond to individuals' needs and are relevant to their circumstances" (Holmes 2011: 1) instead of letting "governments continue to design and deliver services based on his/her own requirements and processes" (McKinsey 2015). In essence, citizen-centric service delivery means that by default, a citizen's interactions with government are based on his/her own specific identity, life situations, and priorities—not on how the government is organized. This idea is linked to the realization by policy makers that citizens are voters and as such, can hold policy makers accountable with elections.

But what does "putting the citizen at the center" mean in practice for government service design and delivery? As expressed by Carson (2011), citizen-centric governments are typically aiming to provide a service quality "inspired by both banks and hotels." This means that they need to provide "integrated public-facing information and service delivery;" gain a clear understanding of citizen segments, preferences, and life events to enhance citizens' experience in their interactions with public agencies; and provide "effective and user-friendly service delivery channels" (for example, through one-stop shops or e-government options). In line with the public sector service value chain model, public agencies can greatly benefit from clearly identifying which of their steps and activities result in added value for citizens, stakeholders, and the wider community (Heintzmann and Marson 2005).
Research carried out in Canada, for instance, found that higher employee engagement in public sector organizations translated into more satisfied customers and, ultimately, into greater trust and confidence in public institutions (Matheson 2009). Given their limited funds, capacity, and capabilities, public agencies should prioritize key drivers of improved customer satisfaction. This proactive seeking of citizen feedback and the taking of actions based on this feedback will in turn often translate into organizational change because government processes may need to be transformed and integrated to better respond to citizens' expectations (OECD 2013; UNDP 2016a).

Instead of designing processes based on what citizen requirements are assumed to be, public institutions that want to maximize their citizen-centricity should collect data and insights from citizens and let that drive their decisions. For example, Denmark's MindLab is a cross-governmental innovation unit that brings together the municipality of Odense with the ministries of business and growth, education, and employment, and, in collaboration with the ministry for economic affairs and the interior, works with a variety of stakeholders at the early planning stages of service delivery. Notably, MindLab worked with citizens to test mobile devices for completing tax returns and collected their feedback, resulting in the government changing its plans, avoiding costly service mistakes, and increasing citizen satisfaction with the service.

   "Citizen-centric service delivery is a profession, not simply an objective to achieve. Training is paramount. You need to invest in your people. You need to teach administrators to be willing to seek ongoing feedback in order to improve. You cannot improve what you do not measure. It needs to be built into organizational practices. You need to benchmark. You need to set goals/targets. You need to identify the key drivers that will improve citizen satisfaction. In summary, invest in your people so that they can better serve citizens."
   — Dan Batista, Executive Director of the Institute for Citizen-Centered Service, Canada
Another example is the redesign of a local police station in Chișinău, Moldova, in 2014. With the support of international partners, policemen, citizens, and representatives of nongovernmental organizations were invited to share their views about the building renovation. Based on the insights gathered, the municipality built a prototype of an improved community police station. It then tested the prototype and, after collecting additional feedback, settled on a plan that included free public WiFi at the police station, the installation of an information board with useful documents for citizens, and an inviting reception area staffed by an on-duty officer. As these examples demonstrate, by soliciting the views of citizens, public administrations can correct misperceptions and better understand what their constituents are looking for and what they appreciate.

In the same vein, it is important for public administrations to share information and to tell their stories so citizens can better understand public sector challenges. There is often considerable room to set up more effective communication strategies and be more open about public resources, priority areas for action, and results. By enhancing transparency and proactivity, a state institution can improve its image with citizens and might even benefit from citizen initiatives. Releasing government datasets in an open format, for instance, may enable communities and citizen groups to develop solutions to problems ranging from waste management to road repair.¹ The regular publishing of data can also help create benchmarks and rankings for the offices that are responsible for delivering specific services; it is an inexpensive way to push for improvement. The approach can also be based on location, similar to that adopted by some private sector actors, with scores and rankings provided by online applications (such as, for example, hotels on TripAdvisor).

1. Datasets should only be publicly released if doing so does not harm the privacy of individuals, that is, they do not include confidential personal data.

In short, there is room for a more intentional focus on providing more individualized services, better utilizing new technologies, and reducing the distance to citizens through more effective and regular two-way communication and feedback as part of the public sector's ongoing transformation efforts. Citizen-centric service delivery is a journey; it is a long-term commitment that requires dedicated and determined leadership to stay the course. It is about building capacity and, ultimately, a culture of service excellence. In striving toward this goal, a public administration can leverage limited resources in powerful ways. Part II offers instruments to help with that effort.

Box 1.3. "Life is about Events, Not Agencies": Building Government Services Around the Needs of New and Expectant Parents in New Zealand

SmartStart is an integrated online tool for new and expectant parents in New Zealand. This multi-agency initiative has been supported by the Ministry of Social Development, Internal Affairs, Ministry of Health, Inland Revenue, Plunket, and NZ Midwives. It features a personalized timeline and checklist based on the baby's due date, making it easy for parents to keep track of progress and see what they need to do before and after the baby arrives. Based on the one-stop-shop principle, SmartStart provides step-by-step information and help in one place, saving the time and money of future parents. SmartStart can be used to notify the Ministry of Social Development of an upcoming birth, request an identification number for the new child, and update a Working for Families Tax Credit application. In addition, birth certificates no longer need to be purchased, saving new parents $26.50. Overall, the initiative has simplified access to and use of government services online and "is a great example of making sure New Zealanders have services designed around them for when they need them" (New Zealand Government 2017b).

Sources: https://smartstart.services.govt.nz; New Zealand Government 2017a, b.

How Can Citizen-Centricity Be Measured?

In order to measure the citizen-centricity of service delivery, selected issue areas and indicators need to be approached "with the goal of bringing the citizen's perspective forward." Furthermore, they have to "seek to define and quantify what citizens judge to be good service so that service providers can understand citizens better" (O'Connell 2000: 53). Ivanya and Shah (2010: 2–4) highlight that existing primary indicators of governance quality tend to focus on the governance environment, that is, the quality of institutions and processes, rather than on governance outcomes, especially in terms of quality of life enjoyed by citizens. According to their study, "one of the most important limitations common to all available composite indexes of governance is that they fail to capture how citizens perceive the governance environment and outcomes in their own countries" (Ivanya and Shah 2010: 2). Including indicators that assess the citizens' evaluation of governance to complement those based on data provided by governments or local experts is essential to developing citizen-centric service delivery indicators.
Citizen-focused governance can be measured by indicators that examine the extent to which citizens' needs are considered during the various stages of public service delivery—design, implementation, evaluation, and review. The main difference between a citizen-centric service delivery indicator and a typical service delivery indicator is that the citizen-centric indicator highlights the citizens' point of view. It is not uncommon for government administrations to perceive themselves differently than they are seen by citizens or even to produce data that suggest high-quality service delivery that do not correspond to citizens' perceptions of the administrative efficiency of the government. For example, officially, it may take only three days to deliver a given document, but from the citizens' perspective, the whole process may in fact be much longer and more cumbersome than that because often the "three days" does not include the time required to request, collect, and receive the additional documents and certificates needed to submit an application. Citizens may also differ from the public administration regarding what they consider to be areas of priority.

Box 1.4. Measuring Citizen Satisfaction: Caveats

Public administrations should be aware of the limitations inherent in measuring citizen satisfaction. They include:

• Expectations are heavily dependent on context. In some models, satisfaction is viewed as the difference between expectation and experience, and thus higher satisfaction rates could be more linked to lowered expectations among citizens than to improvements in service delivery.

• What citizens express as being important in a survey and what actually drives their overall satisfaction may be different. In the airline industry, for example, users often assert that safety is their prime concern. However, when satisfaction levels for specific items are related to overall satisfaction, the presence or absence of blankets is in fact a much more marked driver of satisfaction. Van Ryzin and Immerwahr (2004) tested this idea in the public sector using stated citizen satisfaction for individual services in New York City and statistically derived satisfaction. Their study revealed that improvements in "the top four stated-importance services (fire, police, schools, and garbage)" would still not lead to the same gains in citizen satisfaction as an improvement in "street cleanliness (clean) alone," and "the statistically derived key services explain(ed) much more variation in overall citizen satisfaction than (did) the services explicitly stated as important."

Source: Van Ryzin and Immerwahr 2004: 224–25.
However, citizens are not in a position to evaluate the back-office processes that are integral to public service delivery. Some of the efforts made by public administrations to streamline their business processes or to put in place incentive structures for customer-facing staff, for example, may not be visible to the front-end user. Similarly, it may be difficult for a citizen to evaluate the complex inter-ministerial cooperation involved in the delivery of certain administrative documents or to get an overall sense of the scope and number of transactions carried out by a given institution.

Based on these observations, a dual assessment is recommended to draw a comprehensive picture of an organization's citizen-centricity. Data should be directly collected from citizens regarding their personal experiences with and evaluation of public service delivery. This is essential to help public agencies understand the public's priorities and how to address them, while also illuminating and expanding information that would otherwise be unavailable or less well reported in government statistics.² In addition, the public agency's efforts to reach out to citizens and improve their experience with public services should be reflected through self-reported data. Although government records cannot, by themselves, provide a sufficiently reliable and comprehensive analysis of governance quality, administrative data provide important information on the activity and services of public entities that could not be measured by simply using surveys of the general public. Each type of data complements the other, and they each have strengths and weaknesses.

2. Complementary practices include a range of social accountability tools that put citizen experiences and perceptions at the core of the assessment, such as citizen report cards, citizen charters, community scorecards, and interactive community mapping.

In the next section, two complementary instruments are introduced—a citizen survey and a checklist for public administrators. The instruments focus on the receiver and the provider of public services, respectively. This approach reflects recent literature that has emphasized the importance of bridging the gap between the demand and the supply side of governance (Grandvoinnet, Aslam, and Raha 2015). The survey focuses on the demand side of governance by asking citizens for direct feedback about the performance of a given public entity with which they have been interacting; the checklist examines the supply side of governance—the capacity of the state and public systems in place to serve citizens. The two instruments provide a basic grid that administrations can use to evaluate the level of citizen-centricity of their service delivery processes.

II. The Citizen Survey and Public Administrator Checklist

Objectives and Structure of the Instruments

The citizen survey and the administrator checklist are ready-to-use tools for willing governments at the national, regional, or local level to measure and analyze:
1. How well they are performing from a citizen's perspective, by capturing the experience and perceptions of citizens in their respective constituencies;
2. How well they are performing from their own perspective, by capturing efforts made by the public agency to provide outstanding service delivery; and
3. Any discrepancies between the two.

The questions are designed to solicit actionable information for policy makers, including clear indications of areas needing more attention and the identification of "low-hanging fruits"—easy actions that could improve a citizen's experience with public service delivery. Importantly, any information collected through the assessment, particularly the survey, will be considerably more valuable if it is consistently reported back to users. Such an approach would make citizens realize that their input is being considered and would provide administrators with a benchmark against which changes can be implemented. It would also ensure that the assessment is not merely an extractive exercise but rather a two-way dialogue with an in-built accountability component.

Both the citizen survey and the administrator checklist broadly follow the steps of the citizen's journey in seeking information or receiving a service from a government agency. Each instrument begins with a section that asks for basic information about the respondent, followed by questions divided into four main areas—access, user-centeredness and responsiveness, quality and reliability of service delivery, and public sector integrity; and each concludes with a section for final comments, which is designed to capture any additional suggestions or opinions of citizens and administrators regarding priority areas needing improvement.

The tools echo each other both in terms of structure and types of questions. Any agency seeking to evaluate its citizen-centricity should therefore complete the supply-side administrator checklist and administer the demand-side citizen survey to capture any discrepancies between the government's perspective and that of its citizens, and it should then subsequently address them. The checklist includes frequent prompts to compare the results of the self-assessment with the views expressed by citizens on the same topic.
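The comparison called for in item 3 above can be made mechanical once responses from both instruments are coded on a common numeric scale. The sketch below is a minimal illustration, not part of the instruments themselves; the area names, the 1–5 scores, and the 0.5-point threshold are all hypothetical:

```python
from statistics import mean

# Hypothetical 1-5 scores: citizen survey responses per assessment area,
# and the agency's checklist self-ratings on the same scale.
citizen_scores = {
    "access": [4, 3, 5, 2, 4],
    "responsiveness": [2, 3, 2, 3, 2],
    "quality_reliability": [4, 4, 5, 4, 3],
    "integrity": [3, 4, 3, 3, 4],
}
self_ratings = {
    "access": 4.0,
    "responsiveness": 4.0,
    "quality_reliability": 4.5,
    "integrity": 3.5,
}

GAP_THRESHOLD = 0.5  # illustrative cutoff for flagging a discrepancy

for area, responses in citizen_scores.items():
    citizen_avg = mean(responses)
    gap = self_ratings[area] - citizen_avg
    flag = "review" if gap > GAP_THRESHOLD else "ok"
    print(f"{area:20} citizens={citizen_avg:.2f} "
          f"self={self_ratings[area]:.2f} gap={gap:+.2f} [{flag}]")
```

In this hypothetical run, responsiveness would be flagged: the agency rates itself well above what its users report, which is exactly the kind of gap the paired instruments are designed to surface.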
What Do the Components Capture?

The four key areas of the checklist—access; user-centeredness and responsiveness; quality and reliability of service delivery; and public sector integrity—were selected based on an in-depth literature review of existing indicators of citizen-centric governance and their areas of focus. A paper published by PriceWaterhouseCoopers' Public Sector Research Centre, for example, identifies speed of service delivery, engagement, responsiveness, value for money, integration, choice, and personalized experience as seven key areas where improvements can be undertaken to enhance customer experience and outcomes (PwC 2007: 9). In addition, the Organisation for Economic Co-operation and Development (OECD) publishes biennial Government at a Glance reports that provide indicators comparing the political and institutional frameworks of government across OECD countries. The 2013 report's "set of inter-related process components that encapsulate what citizens expect from government" includes openness and inclusiveness, responsiveness, reliability, and integrity and fairness (OECD 2013: 29). The 2015 report goes one step further to identify access, responsiveness, and reliability and quality as the three essential pillars of a framework to measure public services. The access pillar includes issues such as affordability, geographic proximity, and accessibility of information. The responsiveness pillar includes topics such as the citizen-centric approach, the matching of services to special needs, and timeliness. And the pillar of reliability and quality includes issues regarding the effective delivery of services and outcomes, the consistency of service delivery and outcomes, and security and safety (OECD 2015: 169). These elements are all incorporated in the two instruments presented here.

Access is a decisive performance criterion for citizen-centric service delivery. It examines a public agency's capacity to create and tailor communication and service delivery channels that answer the needs of citizens. Citizens may face myriad barriers to access, such as difficulties in identifying and/or contacting a relevant interlocutor; an insufficient number of, or inadequate, access channels; excessive waiting times or a lack of people with whom to interact; and inconvenient opening times, geographic location, and physical layout of facilities—an issue of particular concern to users with special needs, including people with disabilities, nonnative speakers, and minorities. "Such barriers can decrease awareness of eligibility or existence of services or deter potential recipients" (OECD 2013: 150). An examination of accessibility-related issues includes the consideration of the presence of online channels for service delivery because they "can facilitate access to a wide range of users and provide greater convenience, while also reducing costs for all involved, including governments" (OECD 2013: 154). Further, entire segments of populations, including the most impoverished citizens, recent migrants, and youth, are increasingly accessing online services through their mobile phones rather than on computers.

An example of how the citizen survey and the administrator checklist address the same themes with slight variations is the set of questions regarding e-government. The citizen survey focuses on satisfaction with the interface, including ease of use, presentation, and clarity of the website, while the administrator checklist includes a self-assessment of the website as well as considerations related to the use of e-government features, privacy and identity management, cybersecurity, and the collection of online metrics.

Criteria related to user-centeredness and responsiveness evaluate whether public agencies explicitly recognize, adapt, and respond without delay to the various needs of citizens. A citizen-centric approach implies that the service provider offers solutions that are tailored to various citizen segments instead of supplying a "one-size-fits-all" product. This is measured in the citizen survey by asking citizens if they feel they received personalized service and if they believe that public services are attentive to their needs. The corresponding section in the administrator checklist explores if citizens are involved in service design and if the public agency occasionally contacts them proactively. Each instrument includes a dedicated section examining the issue of responsiveness because timeliness "particularly affects citizens' confidence in the ability of public services to meet their needs" (OECD 2013: 158). The survey questions focus on the actual time it takes for a citizen's request to be answered, if time frames are clearly communicated, and the citizen's idea of an acceptable standard for time-bound service delivery. The administrator checklist prompts the agency to describe its current service delivery standards, the extent to which they are being respected, whether communication on time frames with citizens is systematized, and whether clients have been consulted about their expectations regarding timely service delivery.
Measuring the quality and reliability of service delivery to users is a fundamental element of evaluating an organization's citizen-centricity, including the quality of interaction with staff, such as politeness, fairness, helpfulness, knowledge, and competence; the provision of clear, relevant, comprehensive information; affordability/financial accessibility; ease of process; and satisfaction with outcomes. The citizen survey asks users if they were satisfied with the result of their request, while the administrator checklist asks agencies if they have conducted any usability testing, experimented with customer journey mapping, or simplified their processes and procedures in the past year.

Any citizen-centric evaluation of service delivery performance must include a question of whether or not the citizens feel they are treated fairly and equitably, which is related to the broader topic of public sector integrity. A dedicated section in both the survey and checklist considers the citizen's experience with corruption, the standards of conduct and ethical principles and practices applicable to the institution, as well as the existence of mechanisms encouraging transparency, accountability, and public participation, including feedback and complaint-handling systems to guarantee the fairness, consistency, and quality of services and offer opportunities for redress in cases where the rights of citizens have been infringed upon.

Table 2.1 provides a summary overview of issues examined by the survey and checklist, including aspects where they mirror one another and areas where they differ.

Table 2.1. Summary of Issues Explored in Citizen Survey and Administrator Checklist

Access
  Citizen survey:
  • Finding the relevant contact information
  • Choosing the most convenient access channel
  • Getting in touch with the administration
  • Using e-government/digital procedures
  Administrator checklist:
  • Providing clear contact information
  • Providing various access channels in line with citizens' preferences
  • Interacting with citizens
  • Providing e-government services/digital procedures

User-Centered Service Delivery and Responsiveness
  Citizen survey:
  • Receiving personalized service
  • Receiving timely service
  • Service delivery standards in line with expectations
  Administrator checklist:
  • Providing a personalized service
  • Providing timely service
  • Setting service delivery standards in line with expectations

Reliability and Quality of Service Delivery
  Citizen survey:
  • Interacting with staff
  • Receiving clear, high-quality information
  • Completing the procedure
  • Reaching a satisfactory outcome
  Administrator checklist:
  • Interacting with citizens
  • Providing clear, high-quality information
  • Completing the procedure
  • Reaching satisfactory outcomes for citizens

Public Sector Integrity
  Citizen survey:
  • Interacting with a transparent, corruption-free, and effective public sector
  • Accessing feedback and complaint-handling mechanisms
  • Benefiting from effective interagency cooperation
  Administrator checklist:
  • Embodying a transparent, corruption-free, and effective public sector
  • Providing feedback and complaint-handling mechanisms
  • Guaranteeing effective interagency cooperation

Final Comments
  Citizen survey and administrator checklist:
  • Priority areas
  • Unmet needs
  • Comments, suggestions, and questions
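Several of the checklist prompts above concern published service delivery standards and the extent to which they are met. As a minimal illustration of the supporting arithmetic (the request log and the five-day standard are hypothetical, and the sketch counts calendar rather than working days), an agency could compute its compliance rate from administrative records:

```python
from datetime import date

# Hypothetical request log: (submitted, completed) dates for one service.
requests = [
    (date(2017, 3, 1), date(2017, 3, 3)),
    (date(2017, 3, 2), date(2017, 3, 10)),
    (date(2017, 3, 6), date(2017, 3, 8)),
    (date(2017, 3, 7), date(2017, 3, 21)),
]

STANDARD_DAYS = 5  # assumed published standard: "within 5 days"

durations = sorted((done - submitted).days for submitted, done in requests)
on_time = sum(1 for d in durations if d <= STANDARD_DAYS)
share = 100 * on_time / len(durations)

print(f"longest duration: {durations[-1]} days")
print(f"met the standard: {on_time}/{len(durations)} requests ({share:.0f}%)")
```

Note that such figures describe the agency's side only; as the three-days example in Part I showed, the elapsed time experienced by the citizen can be considerably longer, which is why the survey asks about actual waiting times as well.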
Question Types

In addition to collecting hard data, there are multiple types of questions that are helpful in developing indicators of a service delivery system's citizen-centricity. The types of questions most frequently used in surveys are experience-based, evaluation-based, and scenario-based.

Experience-based questions ask respondents to relay their own, lived experiences. They are often used in surveys related to the occurrence of corruption or bribery. Their advantage is their ability to provide objective, reliable data and to generate evidence regarding how citizens actually interact with public institutions. Their disadvantage is that they only allow for a limited scope of enquiry. An illustrative example of an experience-based question, borrowed from the World Justice Project's Rule of Law Index Report (2016), is: "Did you have to pay a bribe (or money above that required by law) to obtain the information?"

Evaluation- or perception-based questions assess the subjective way that respondents acquire, interpret, and organize information. Instead of assessing facts or knowledge, they consider a respondent's personal assessment of a given topic. Satisfaction ratings are a typical example of evaluation-based questions. These questions can provide insights into the general views of respondents and produce information about situations for which objective and comparable data are difficult to obtain. However, evaluation-based questions are criticized for being subjective and because a respondent's evaluation can be impacted by unrelated factors, such as the political and economic environment, their trust in or sympathy toward the government, or recent events. An illustrative example of an evaluation-based question (based on OECD 2012) is:

   How would you rate the level of service provided by the tax office staff with whom you had contact over the past 12 months?
   o Excellent
   o Good
   o Neither good nor poor
   o Poor
   o Very poor

Some surveys include scenario-based questions, which present the respondent with a hypothetical scenario and several options for answers. For example, consider the following scenario (World Justice Project 2016):

   Assume that a high-ranking government officer is taking government money for personal benefit. Also assume that one of his employees witnesses this conduct, reports it to the relevant authority, and provides sufficient evidence to prove it. Assume that the press obtains the information and publishes the story. Which one of the following outcomes is most likely?

   (1) The accusation is completely ignored by the authorities.
   (2) An investigation is opened, but it never reaches any conclusions.
   (3) The high-ranking government officer is prosecuted and punished (through fines, or time in prison).

Scenario-based questions are not as frequently used in surveys. While they invite frank responses due to their nonthreatening nature—that is, the respondent may feel free to speak openly because the scenario is hypothetical—their usefulness is debatable because a respondent may not be able to identify with a provided scenario and because they only provide a general sense of the perceived likelihood of events occurring. Furthermore, it takes a long time to read scenarios to respondents, and they require greater levels of comprehension and memory on the part of respondents.

In addition, hard data provide insights that can enrich an overall picture. They can be measured, traced, and validated. At the same time, they do not provide an understanding of situational nuances. Turkey's annual report of the office of the prime minister provides an example of hard data: "In 2015, the Public Officials Ethical Committee received 126 applications from citizens, 13 of which dealt with conflict of interest claims" (Republic of Turkey 2016). Hard data allow us to quantify results and initiate comparisons over time or across different actors. However, they are primarily focused on outcomes rather than processes, and are therefore best used in combination with other types of questions. And because much data originates from units with a potential interest in inflating accomplishments, a healthy skepticism of their validity and reliability is warranted.
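When assembling an instrument, the question type can be carried as metadata alongside each item so that the mix of methodologies stays explicit. The following sketch is purely illustrative (the `SurveyQuestion` structure is hypothetical, with the sample items borrowed from the examples above):

```python
from dataclasses import dataclass, field

@dataclass
class SurveyQuestion:
    """One instrument item, tagged with one of the question types above."""
    text: str
    qtype: str                 # "experience", "evaluation", or "scenario"
    options: list = field(default_factory=list)  # closed-ended choices

questions = [
    SurveyQuestion(
        "Did you have to pay a bribe (or money above that required by law) "
        "to obtain the information?",
        qtype="experience",
        options=["Yes", "No"],
    ),
    SurveyQuestion(
        "How would you rate the level of service provided by the tax office "
        "staff with whom you had contact over the past 12 months?",
        qtype="evaluation",
        options=["Excellent", "Good", "Neither good nor poor",
                 "Poor", "Very poor"],
    ),
]

# Tallying items by type makes the balance of methodologies easy to audit.
mix = {}
for q in questions:
    mix[q.qtype] = mix.get(q.qtype, 0) + 1
print(mix)  # {'experience': 1, 'evaluation': 1}
```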
The citizen survey combines experience-based and evaluation-based questions, taking advantage of both methodologies' strengths. For example, Section 4—Reliability and Quality of Service Delivery—is composed exclusively of questions asking citizens about the extent to which they strongly agree, agree, disagree, or strongly disagree with a number of statements related to interaction with staff, quality of information provided, procedures, and outcomes. Section 5—Public Sector Integrity—asks several experience-based questions, such as if the respondent has been asked to do a favor, give a gift, pay an official a bribe, or provide feedback about a received service. The survey does not include any scenario-based questions per se, but there are a few questions in section 5 that ask if the respondent would know where to file a complaint or where to report corruption in a hypothetical scenario.

The administrator checklist also contains a mix of evaluation-based and experience-based questions, but it is more focused on the collection of hard data. In Section 2.3—Interacting with Citizens—for example, administrators are asked to self-evaluate their agencies' performance regarding the ease with which citizens can contact them. Several questions explore past events related to the agency that require an answer of "yes" or "no," such as: "Has the agency ever conducted accessibility testing of its services?" There are also questions that call for the collection of hard data, such as: "In the past 12 months, how many citizens have contacted the agency using the following channels?"

In addition, concrete actions to improve service delivery can be derived fairly easily from the provided answers. The survey and checklist are intended to be used by public institutions to flag areas that are lagging in terms of citizen-centric service delivery and help to improve them. Therefore, the questions were selected based on their potential to highlight the experiences and perceptions of citizens and administrators and to allow the identification of actionable areas within the public agency's control.

All of the survey and checklist questions have either been used before or have been adapted from a variety of sources exploring the interactions of citizens with public agencies, including international questionnaires such as the Open Government Index; national-level questionnaires such as those administered by the governments of Canada, France, Ireland, and New Zealand (CCMD 1998; SGMAP 2015; DPER 2015; New Zealand Government 2015); European Union-wide surveys such as the European Commission's E-government Benchmark, Eurobarometers, and the European Quality of Life Survey (EC 2012, 2014; Eurofound 2012); and subnational indices such as the European Quality of Government Index, the International Republican Institute's Ukraine Municipal Survey, the Vietnam Provincial Governance and Public Administration Performance Index, Transparency International Slovakia's transparency and openness ranking of cities and regions, and the Center for the Study of Democracy's Monitoring Anticorruption Policy Implementation tool (Charron 2013; IRI 2015; UNDP 2016b; Transparency International 2012; CSD 2015).

Answer Types

Except for Section 6—Final Comments, all of the citizen survey questions are closed-ended to facilitate data collection: respondents are asked to indicate yes or no or to choose from a limited set of possible answers. This is also true for the supply-side checklist, although a number of its questions ask for additional information, comments, and detail as well. A summary of the advantages and disadvantages of open- and closed-ended questions is presented in table 2.2.

Several questions in the survey and the checklist make use of a Likert measurement scale—a list of items expressing positive or negative attitudes toward a specific issue. The Likert scale typically provides for five different possible answers: strongly agree, agree, uncertain, disagree, or strongly disagree. This type of scale allows for the delineation of how respondents feel about a given area. One advantage of the Likert scale is the ease with which it can be administered. It also allows for greater differentiation than does a simple yes/no or agree/disagree option because perspectives can be expressed along a continuum. The use of a Likert scale throughout a survey allows respondents to choose the gradation of their opinions and facilitates its administration, especially if it is self-administered.

Table 2.2. Advantages and Disadvantages of Open- and Closed-Ended Questions

Closed-ended
  Advantages:
  • Provides uniform responses
  • Easy to administer; saves time and intellectual energy of the respondent and the interviewer
  • Helps respondents concentrate on the aspects important to the researcher
  • Easy to code; significantly reduces transcription errors
  • Easy to analyze
  • Allows for comparisons and quantification
  • More likely to produce fully completed questionnaires while avoiding irrelevant responses
  Disadvantages:
  • Respondents can neither clarify nor further express their positions
  • Respondents are somewhat passive and not encouraged to reflect
  • If respondents have nothing to say, they may give a nonreasoned or casual answer
  • Can prevent original contributions that do not fit into preset categories

Open-ended
  Advantages:
  • Allows respondents to express their answers in their own words
  • Guarantees greater freedom and spontaneous answers
  • Invites respondents to share understandings, experiences, opinions, and interpretations
  Disadvantages:
  • Requires additional work and incurs additional costs during the data cleaning, analysis, and coding phases
  • Cognitive effort required from the respondent and greater potential for distortion by the interviewer when recording the response
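Since both instruments lean on five-point Likert items, a small worked example may help. The sketch below uses hypothetical responses, and its coding scheme is one common convention rather than something the instruments prescribe; it tallies one item and also reports the "lumped" positive and negative shares discussed under Answer Scale in Part III:

```python
from collections import Counter

LIKERT = ["strongly agree", "agree", "uncertain",
          "disagree", "strongly disagree"]

# Hypothetical responses to one item ("Staff were polite to me").
responses = (["strongly agree"] * 26 + ["agree"] * 44 + ["uncertain"] * 8 +
             ["disagree"] * 15 + ["strongly disagree"] * 7)

counts = Counter(responses)
n = len(responses)
for answer in LIKERT:
    print(f"{answer:17} {100 * counts[answer] / n:5.1f}%")

# Lumped shares, as in "70 percent agree or strongly agree":
positive = counts["strongly agree"] + counts["agree"]
negative = counts["disagree"] + counts["strongly disagree"]
print(f"positive (top two answers):    {100 * positive / n:.0f}%")
print(f"negative (bottom two answers): {100 * negative / n:.0f}%")
```

On an even (four-point) scale the "uncertain" row simply disappears and the positive and negative shares sum to 100 percent.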
III. Customizing the Citizen-Centric Service Delivery Assessment

The citizen survey and the administrator checklist can be customized to fit a public entity's particular mandate and characteristics relating to service delivery. The questions included are a first cut from a much wider range of questions that can be considered. They are intended as a basic framework and repository of key aspects for consideration. The four main areas examined—access, user-centeredness and responsiveness, quality and reliability, and public sector integrity—are relevant across various types of public bodies with a service delivery mission. The survey and checklist can be adapted to provide tailor-made indications of public service delivery quality. The assessment should be tailored to the local and national context and to normative regulations.

Objectives

An agency seeking to implement the citizen-centric service delivery assessment should begin by clearly identifying its goals, which should then guide the broad parameters of the survey and checklist. What are the service delivery questions or issues that need answers? What issues is the assessment intended to address? What data are already available, and where are there gaps in the data? What kind of analysis is anticipated? What level of precision is required? What resources are available for carrying out the assessment in terms of finances, personnel, and technology? What are the existing constraints? What limitations would be considered acceptable in terms of target population, coverage, and number of responses? Should the assessment focus on one service delivery process or several? For example, the assessment design will vary depending on whether an agency is trying to bolster an argument (regarding, for example, how to reorganize certain business processes) or just trying to get a sense of how its clients perceive its service delivery. If, for instance, a social security department is seeking empirical evidence to support the claim that the process of delivering medical insurance cards must be remodeled, it might choose to start with a citizen survey narrowly focused on the perceived ease and timeliness of the service. If a municipality wants to get an overall sense of how effective its service delivery is in a given area, it might start with a self-assessment, and then conduct a comprehensive online survey of citizens who have recently requested or used the service in question. If a coordinating agency considers carrying out a citizen-centric service delivery assessment, it might also consider using the instruments to encourage yardstick competition among its subordinate bodies.

Number and Sequencing of Questions

The implementing agency must decide the questions to focus on as well as their sequencing. The length of the instruments can be adapted based on a cost-benefit assessment that considers available resources, the burden on respondents and interviewers, and the usefulness of collected data. The survey example presented here is somewhat thorough; its length may not be well suited to various types of administrations. For example, a telephone or self-administered survey must be shorter than a face-to-face survey. It is therefore important for the agency to specifically choose the topics to cover and at what depth. Given the fact that, once set up, changing a questionnaire for additional iterations reduces or eliminates the possibility of comparing results across time, starting small and simple with a focus on the very core aspects that are of interest to the agency can be a sensible strategy.

An implementing agency might also consider adapting the sequencing of the survey questions. As with any survey, the order of the items asked as well as the headings direct the attention of respondents and influence the fluidity of the questionnaire. Agencies might consider developing a short introduction to the survey, outlining its goals and indicating the approximate amount of time the survey will take to fill out. It can also be useful to complement section changes with brief transition sentences that guide the respondent toward the next topic. Depending on the implementing agency's preference, the collection of respondents' demographic information may be moved toward the end of the survey, as respondents tend to be mentally tired following a series of analytical considerations and appreciate closing on easy questions. Questions about age and income may also be perceived as sensitive by some respondents and discourage them from participating, so asking these questions at the end of the survey could reduce the drop-out rate. The first questions asked should be engaging because many will decide to continue or abandon the survey at an early stage.

Answer Scale

Some types of questions allow for either an even or uneven answer scale, such as a four- or five-point scale, respectively. The main difference between the two is that an uneven scale allows for a central no-preference or neutral answer option, and thereby legitimizes such a position. Survey specialists have mixed views on this topic. Some suggest that a neutral option should not be provided because respondents might opt for this cognitively easy answer and thus indiscriminately select the central value. Dolnicar and Grün (2014) call this phenomenon evasion behavior. Proponents of an even scale argue that respondents should be "forced" to take a stance and provide a clearer indication of their views, either positive or negative. Others argue that providing a neutral response is a valid outcome for some questions and that it is inappropriate to push respondents into taking a side. They further stress that using a four-point scale risks producing an acquiescent response set, that is, reinforcing an individual's tendency to want to be agreeable, which could distort the data.

The choice of an even or uneven scale should be influenced by the survey's objectives. Indeed, it appears that even scales are usually favored by actors with a political stake in the results, such as a nongovernmental organization that wants to determine the quality of service delivery or a newly elected mayor who wants to assess what is going well and what is not. An even scale can also simplify the summary and presentation of results because the aggregate positive or negative answers can be lumped together—for example, 70 percent of respondents strongly agree or agree that staff was polite to them, while 30 percent disagree or strongly disagree.

Another option is to use a four-point scale plus a no-opinion category, or to choose the type of scale based on the interviewing method envisaged. Indeed, for phone interviews, respondents are typically given a limited number of answer options, usually around four.
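A minimal sketch of the aggregation described above, assuming answers on an even four-point scale are recorded as label strings (the labels and the sample data are illustrative):

    # With an even scale every substantive answer is either positive or negative,
    # so the two aggregate shares sum to 100 percent.
    POSITIVE = {"agree", "strongly agree"}
    NEGATIVE = {"disagree", "strongly disagree"}

    def aggregate_shares(responses):
        answered = [r.lower() for r in responses if r.lower() in POSITIVE | NEGATIVE]
        positive = sum(1 for r in answered if r in POSITIVE)
        return {"positive_pct": round(100 * positive / len(answered), 1),
                "negative_pct": round(100 * (len(answered) - positive) / len(answered), 1)}

    # 7 of 10 hypothetical respondents agree or strongly agree: 70.0 / 30.0.
    sample = ["agree"] * 4 + ["strongly agree"] * 3 + ["disagree"] * 2 + ["strongly disagree"]
    print(aggregate_shares(sample))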
Target Group and Survey Method

A resource-effective way to administer the citizen survey is through a targeted survey of citizens who have had fairly recent contact with the agency (for example, over the past 12 months). For a local police office, this could mean getting in touch with recent crime victims; for a public finance center, citizens who have recently visited the center to receive help with their tax-related questions. Another option is to aim for a representative sample of the target population, for example, through a general telephone survey of the adult population. Due to its scientific grounding, representative sampling carries greater weight when considering changes based on survey findings. However, this method tends to be more expensive because if service users comprise only a small share of the population, then a large sample might be required. Hence, targeting the subpopulation that has recently used one of the agency's services—such as with an email or phone survey after the use of a specific service—is more likely to produce useful results, is a better use of resources, and allows agencies to extract more actionable data.

At the same time, for issues regarding access, it is important to keep in mind that it may be necessary to include non-users, because users by definition have successfully accessed a service. An implementing agency might also choose to administer the survey to a focus group, that is, a small but diverse group of actual users convened for a one-off discussion, or to a service user panel (called a consumer panel in the private sector). Another option is the self-selection of respondents through a website- or social-media-based survey. However, this method is biased in that it draws on the most motivated users of the service, who are keen on participation—the "loudest voices"—a potentially interesting but not necessarily representative group.

Regarding the methods used to reach citizens, several options are available, including face-to-face, telephone, postal, or Internet administration of the survey. A mixed interviewing mode (using two or more of these options) is another possibility; while this approach can reduce costs associated with the interview process and improve response rates, it can be difficult and expensive to implement where resources are limited. Another aspect of the survey for the agency to determine is an adequate sample size. A summary of the advantages and disadvantages of various survey administration modes is presented in appendix C.
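The cost implication of representative sampling can be made concrete with a standard calculation. The sketch below uses Cochran's approximation for a simple random sample of a large population; the 5 percent margin of error, 95 percent confidence level, and 10 percent user incidence are illustrative assumptions, not recommendations.

    import math

    def required_sample(margin=0.05, z=1.96, p=0.5):
        """Cochran's approximation: completed interviews needed for a given
        margin of error at a given confidence level (z = 1.96 for 95 percent)."""
        return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

    n = required_sample()            # about 385 completed interviews for +/-5 points
    print(n)

    # If recent service users make up only, say, 10 percent of the adult
    # population, a general-population survey must screen roughly ten times
    # as many people to reach the same number of users.
    incidence = 0.10                 # assumed share of recent users; illustrative only
    print(math.ceil(n / incidence))  # about 3,850 contacts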
Frequency of Enumeration

An agency might choose to carry out a citizen survey as a standalone initiative or include it in its existing survey vehicles, depending on available infrastructure. If the agency is already administering other types of feedback instruments, adding a selection of citizen-centric questions to the existing tools could prove to be the most cost-effective approach. Depending on its budget, the agency might decide to conduct the survey itself or to outsource its administration to an external agent.

An agency might decide to carry out the citizen survey as a one-off initiative, for example to collect data on a specific aspect of service delivery to inform its decision-making process; on a periodic basis, such as annually; or continuously. Collecting data on a periodic basis allows for comparisons across time and the development of benchmarks. It increases the relevance of the data collected, which can then be used to inform evaluation processes and monitor performance. However, this approach necessitates a regular and considerable budget commitment. Continuous surveying involves the collection of data from users on an ongoing basis. This might mean that every user of a given service is contacted after a transaction and asked to provide feedback. Albania's Ministry of State for Local Issues and Anticorruption used this approach in 2015. A text message was automatically sent out to citizens who had recently received treatment at a state-run hospital, enquiring if they had been asked to pay a bribe (Kunicova 2015). This type of continuous or spot checking is becoming more prevalent and is greatly facilitated by the spread of new technologies. A variety of technology-based methods can be considered in numerous combinations, depending on the goals and needs of the agency. Continuous surveying, such as with follow-up emails to citizens after they receive a service or a customer feedback stand at the exit of a service center with happy- and sad-face buttons, allows for the pooling of data over time. However, this approach requires the regular use of resources and may not be practical for some organizations.
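Pooling such continuous feedback is largely a matter of bucketing time-stamped records. A minimal sketch, assuming each press of a happy- or sad-face button is logged with its date (the log format is an assumption made for illustration):

    from collections import defaultdict
    from datetime import date

    def weekly_satisfaction(events):
        """Pool (date, "happy" | "sad") records into per-ISO-week shares of
        positive feedback, so results can be compared over time."""
        buckets = defaultdict(lambda: [0, 0])      # (year, week) -> [happy, total]
        for day, mood in events:
            year, week, _ = day.isocalendar()
            buckets[(year, week)][1] += 1
            if mood == "happy":
                buckets[(year, week)][0] += 1
        return {wk: round(100 * happy / total, 1)
                for wk, (happy, total) in sorted(buckets.items())}

    # Hypothetical button presses at a service center exit.
    log = [(date(2018, 3, 5), "happy"), (date(2018, 3, 6), "sad"),
           (date(2018, 3, 12), "happy"), (date(2018, 3, 13), "happy")]
    print(weekly_satisfaction(log))    # {(2018, 10): 50.0, (2018, 11): 100.0}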
Finally, to guarantee the effective use of the survey tool, it is strongly recommended that administrations pretest the questionnaire (for example, through the use of focus groups or cognitive testing methods) for completeness; ease of use in terms of time required to complete, clarity of instructions and questions, and sequence; perceptions of respondents; and suitability of the data collection channel. Precautions should be taken to avoid creating space or incentives for manipulating results, including paying close attention to the formulation of questions, the timing of and circumstances under which the survey is conducted, the data collection process, and the analysis of results. This is particularly vital because any such manipulation would be a significant step away from citizen-centric service delivery and would likely erode rather than reinforce citizens' trust in public institutions.

Parts IV and V present templates for the citizen survey and administrator checklist, respectively. As noted, these templates are intended as general guidance. They offer a set of ideas to evaluate citizen-centric service delivery, but they can and should be adapted to the mandate and characteristics of the agency conducting the assessment.

Appendixes A and B provide examples of how the citizen-centric service delivery assessment tools might be used by a municipal registry office that issues birth, wedding, and death certificates. The completed checklist is accompanied by comments and a short summary of the insights provided by the agency's self-assessment, which then guides the content of the citizen survey. The appendixes provide a realistic illustration of the type of findings and recommendations that can be drawn from the instruments.

IV. The Citizen Survey

Section 1: Respondent Information

1.1. Gender:
☐ Male ☐ Female
1.2. Year of birth:
1.3. Highest educational attainment:
☐ Primary education
☐ Secondary education
☐ Short-cycle tertiary education (e.g., higher technical, community college, technician-level training, and advanced/higher vocational training—usually two years of postsecondary education)
☐ Bachelor's degree or equivalent
☐ Master's degree or equivalent
☐ Doctoral degree or equivalent
1.4. Professional situation:
☐ Working (full-time, part-time, or self-employed)
☐ Homemaker
☐ Retired
☐ Unemployed
☐ Student
☐ Other:
1.5. Annual income before tax (in euros):
☐ Less than 10,000
☐ 10,000–20,000
☐ 20,000–30,000
☐ 30,000–40,000
☐ 40,000+
Note: Suggested ranges should be adapted to the local context.
1.6. Postal code:
Note: Postal codes can help identify the degree of urbanization and the NUTS (nomenclature of territorial units for statistics) region of the respondent. But asking respondents for their postal codes is not always necessary and can sometimes lead to nonresponses because in some countries, postal codes identify individual houses.
1.7. Recent interactions with public agencies and officials. Over the past 12 months, have you come into contact with [name of agency] either for your own purposes or on behalf of someone else, whether in person; by phone, mail, or email; or on a website?
☐ Yes ☐ No
1.8. Type of interaction with public agencies or officials. Why did you come into contact with [name of agency]?
☐ I was searching for information.
☐ I wanted to submit a question, suggestion, or complaint.
☐ I was looking for a public service. Elaborate:
Note: Agencies administering this survey can code their services and provide closed-ended answer choices to question 1.8.

Section 2: Access

2.1. Finding the relevant contact information
2.1.1. How satisfied were you with the ease of finding the correct website/address/contact person?
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.1.2. Did you approach another government agency before finding the one that could actually deal with your enquiry?
☐ Yes ☐ No ☐ Do not remember
If the answer to question 2.1.2 is "yes":
2.1.3. How many different agencies did you approach before you found the one that could actually deal with your enquiry?
☐ 2 ☐ 3 ☐ 4 ☐ 5+
2.2. Choosing the most convenient access channel
2.2.1. When you looked for information or came into contact with [name of agency], which of the following means of interaction did you use? Select all that apply.
☐ In-person, face-to-face contact with public official
☐ Posted letter and/or facsimile
☐ Telephone (fixed line or mobile)
☐ Email
☐ Website
☐ Tablet/smartphone applications
☐ Social media
2.2.2. If you were to come into contact with [name of agency] again in the future, what would be your preferred channel to interact? Select one.
☐ In-person, face-to-face contact with public officials
☐ Posted letter and/or facsimile
☐ Telephone (fixed line or mobile)
☐ Email
☐ Internet/website
☐ Tablet/smartphone applications
☐ Social media
2.3. Getting in touch with the administration
After you identified the correct website/address/contact person, how satisfied or dissatisfied were you with the following?
2.3.1. Ease of contacting the government entity
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.3.2. Overall waiting time to get your query answered (e.g., on the phone, at the facility, or to receive a response by mail or email)
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.3.3. Number of public servants required to resolve your request
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
If your contact was in person/face-to-face, how satisfied were you with the following?
2.3.4. Opening hours
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.3.5. Time it took you to reach the facility
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.3.6. Physical layout of the facility
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.3.7. How many public servants did you interact with?
☐ 1–2 ☐ 3–4 ☐ More than 4 ☐ Do not remember
2.4. Using e-government/digital procedures
If you did not check the boxes for "email," "website," "tablet/smartphone applications," or "social media" in question 2.2.1:
2.4.1. Why have you not used email, websites, tablet/smartphone applications, or social media to contact public agencies or officials? Check all that apply.
☐ I was unaware of the relevant website or online service.
☐ I do not know how to use/am not familiar with online tools.
☐ I prefer personal contact.
☐ Things get done more easily and/or more quickly through other channels.
☐ I am worried about the protection and security of personal data on the Internet.
☐ The relevant services will require personal visits or paper submission anyway.
☐ Other (please specify):
2.4.2. If it were possible, would you like to do everything with [name of agency] online?
☐ Yes ☐ No
If you checked "website" and/or "tablet/smartphone applications" in question 2.2.1: Thinking of the most recent contact you had online using a personal computer, laptop, mobile device, or tablet, what was your level of satisfaction/dissatisfaction with the following?
2.4.3. Ease of navigating website/application
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.4.4. Presentation of website/application
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.4.5. Ease of downloading material
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.4.6. Information/documents available on website
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.4.7. Clarity of online forms
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.4.8. Instructions, support, and/or help functionalities
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember
2.4.9. Did you encounter any technical problems while using the website/application?
☐ Yes ☐ No ☐ Do not remember
If yes, please explain:
2.4.10. To what extent do you agree or disagree with the following: I am confident that any personal data I provide to government agencies is securely managed/properly protected.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

Section 3: User-Centered Service Delivery and Responsiveness

3.1. Receiving personalized service
To what extent do you agree or disagree with the following statements?
3.1.1. The service I received took into account my individual circumstances and preferences.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
3.1.2. Based on my most recent interaction, I would say that public services are attentive to their users' needs.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
If you checked "disagree" or "strongly disagree" in question 3.1.1 and/or 3.1.2:
3.1.3. Why were you dissatisfied? Check all that apply.
☐ The government agency offered you a generic solution that did not match your specific circumstances.
☐ The government agency failed to treat you with proper respect and empathy.
☐ Other, please explain:
3.2. Receiving timely service
3.2.1. How much time passed between the moment you requested a service and the moment you considered your problem solved?
☐ Up to 5 minutes ☐ Up to 15 minutes ☐ Up to 30 minutes ☐ Up to 1 hour ☐ Up to half a day ☐ Up to a day ☐ Up to 1 week ☐ Up to 2 weeks ☐ Up to 1 month ☐ Up to 3 months ☐ Up to 6 months ☐ Up to 1 year ☐ Not yet resolved ☐ Do not remember
To what extent do you agree or disagree with the following?
3.2.2. It was clear to me how long the process would take to complete.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree ☐ Not applicable
3.2.3. The service was performed within the indicated time frame.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree ☐ Not applicable
3.2.4. I was satisfied with the time it took to get an answer to my initial query.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree ☐ Not applicable
3.2.5. Overall, I was satisfied with the amount of time it took to get the service/to deal with my query.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree ☐ Not applicable
3.2.6. How many times did you have to get in touch with [name of the agency] to follow up on your request?
☐ None ☐ 1 ☐ 2 ☐ 3 ☐ 4+ ☐ Do not remember
3.3. Service delivery standards in line with expectations
3.3.1. If you call with a request, what is a reasonable amount of time to wait before speaking with a government representative?
☐ None ☐ 30 seconds ☐ 1 minute ☐ 2 minutes ☐ 3 minutes ☐ 4 minutes ☐ 5 minutes ☐ Longer than 5 minutes
3.3.2. If you call with a request, what is the maximum number of people you should have to deal with?
☐ 1 ☐ 2 ☐ 3 ☐ 4 or more
3.3.3. If you leave a voice mail message at 10:00 a.m., what is a reasonable amount of time to wait before receiving a return call?
☐ 1 hour ☐ 4 hours ☐ Same day ☐ Next day ☐ Within 3 days ☐ Within 1 week ☐ Longer than 1 week
3.3.4. If you visit a government office, what is a reasonable amount of time to wait in any line?
☐ 1 minute ☐ 2–4 minutes ☐ 5–9 minutes ☐ 10–14 minutes ☐ 15–19 minutes ☐ 20–24 minutes ☐ 25–30 minutes ☐ More than 30 minutes
3.3.5. If you visit a government office, what is the maximum number of people you should have to deal with?
☐ 1 ☐ 2 ☐ 3 ☐ 4 or more
3.3.6. When you write or send paper documents to a government office, what is a reasonable amount of time to wait before receiving a mailed reply?
☐ 1 week ☐ 2 weeks ☐ 3 weeks ☐ 4 weeks or more
3.3.7. When you email or send documents electronically to a government office by 10:00 a.m., what is a reasonable amount of time to wait before receiving an electronic reply?
☐ 1 hour ☐ 4 hours ☐ Same day ☐ Next day ☐ Within 3 days ☐ Within a week ☐ Longer than a week
Section 4: Reliability and Quality of Service Delivery

4.1. Interacting with staff
To what extent do you agree or disagree with the following statements?
4.1.1. Staff were polite to me.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.1.2. Staff treated me fairly.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.1.3. Staff paid extra attention to me and went out of their way to get me what I needed.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.1.4. Staff were knowledgeable/competent regarding the subject matter.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.2. Receiving clear, high-quality information
To what extent do you agree or disagree with the following statements?
4.2.1. I received high-quality information/advice.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.2.2. I received all the information/advice I needed in one interaction.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.2.3. The information/advice was provided in clear, simple language.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.3. Completing the procedure
4.3.1. How difficult was it for you to cover the cost of receiving the service?
☐ Very difficult ☐ A little difficult ☐ Fairly easy ☐ Very easy
To what extent do you agree or disagree with the following statements?
4.3.2. The process was straightforward and easy to understand.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.3.3. The succession of steps in the process was logical.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.3.4. The process was easy to complete.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.3.5. The process required little paperwork.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.4. Reaching a satisfactory outcome
4.4.1. Did you ultimately receive the service you requested?
☐ No, not at all ☐ Partially ☐ Yes, completely ☐ The issue is still pending
4.4.2. Thinking about the entire experience, how satisfied were you with the service you got?
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied
4.4.3. Was the service provided better or worse than you expected?
☐ Much worse ☐ Worse ☐ Better ☐ Much better
4.4.4. Would you recommend using this service to another citizen?
☐ No, not at all ☐ Not really ☐ Yes, probably ☐ Absolutely
Section 5: Public Sector Integrity

5.1. Interacting with a transparent, corruption-free, and effective public sector
5.1.1. Thinking about your interactions with [name of agency] over the past 12 months (for any service), did you or anyone you know have to do a favor, give a gift, or pay an official extra money to get a service or document?
☐ Yes ☐ No
If yes, please indicate the approximate monetary value of this favor, gift, or bribe:
5.1.2. Do you think any of the following would have assisted you in receiving easier service from this agency? Check all that apply.
☐ Better connections to officials who work at or run the agency
☐ Better prior information about what was required from you
☐ Better prior information about your rights and what you are entitled to
☐ Other, please indicate:
In your opinion, how serious are the following problems of the public agency with which you interacted?
5.1.3. Corruption—the use of public office for private gain, which can take many forms, such as bribery, extortion, fraud, embezzlement, collusion, abuse of discretion, favoritism, gift giving, nepotism, cronyism, and patronage
☐ Insignificant ☐ Not very significant ☐ Quite significant ☐ Very significant
5.1.4. Lack of a service culture among public sector staff
☐ Insignificant ☐ Not very significant ☐ Quite significant ☐ Very significant
5.1.5. Lack of opportunities for citizens to participate in the design of policies and services
☐ Insignificant ☐ Not very significant ☐ Quite significant ☐ Very significant
5.2. Accessing feedback and complaint handling mechanisms
5.2.1. Were you asked to evaluate the service you received?
☐ Yes ☐ No ☐ Do not remember
5.2.2. If you wanted to submit a complaint about the public service (for example, to report a case of unethical behavior, favoritism, poor service delivery, or unjust outcome), would you know where to file it?
☐ Yes ☐ No ☐ Not entirely sure
5.2.3. Have you ever wanted to complain about a service you received from this government entity?
☐ Yes ☐ No
If the answer to question 5.2.3 is "no," proceed to 5.2.7.
If the answer to question 5.2.3 is "yes":
5.2.4. Did you submit an official complaint?
☐ Yes ☐ No
If the answer to question 5.2.4 is "yes":
5.2.5. How did you complain?
☐ Face-to-face ☐ By letter or facsimile ☐ By email ☐ By phone/calling a hotline ☐ On agency website ☐ Through a nongovernmental organization (NGO) ☐ On social media ☐ Other (specify):
If the answer to question 5.2.4 is "no":
5.2.6. Why did you not register a complaint?
☐ I did not know how.
☐ It would be pointless/it would not lead to any change.
☐ It would take too much time/effort.
☐ I was afraid it could have negative consequences for me.
☐ Other (specify):
If you were to experience or witness a case of corruption, would you:
5.2.7. Be willing to report it?
☐ Yes ☐ No ☐ Not entirely sure
5.2.8. Know where to report it?
☐ Yes ☐ No ☐ Not entirely sure
5.3. Benefiting from effective interagency cooperation
Thinking now about all the times you have personally used or had contact with [name of public service provider] over the last 12 months, have you encountered any of the following?
5.3.1. The agency asked you to provide information it was supposed to have already.
☐ Yes ☐ No
5.3.2. The agency provided you with information contradicting something you had heard or read elsewhere.
☐ Yes ☐ No
5.3.3. The agency redirected you to another office or government agency with little positive outcome for you.
☐ Yes ☐ No
5.3.4. The agency contacted you proactively about a useful service or information you might need in the future.
☐ Yes ☐ No

Section 6: Final Comments

6.1. In your view, what should the public sector's priority area be in terms of improving public service delivery? Check one.
☐ Simplify access to services
☐ Improve quality of services
☐ Reduce cost of services
☐ Improve staff behavior
☐ Improve timeliness
☐ Develop and/or improve online services
☐ Reduce corruption
☐ Reduce red tape and paperwork
☐ Other (please specify):
6.2. Is there any demand or unmet need regarding public services that you would like to bring to the attention of any particular public agency? Please explain.
6.3. Do you have any additional comments, suggestions, questions, or concerns you would like to share? Please elaborate.

Thank you for completing the questionnaire.

V. The Administrator Checklist

Section 1: Contributor Information

1.1. Primary contributor information
This information is for validation purposes only. It will not be publicly released.
Title (e.g., Mr., Ms., Dr.):
First Name:
Last Name:
Job title:
Highest level of educational attainment:
☐ Primary education ☐ Secondary education ☐ Short-cycle tertiary ☐ Bachelor's degree or equivalent ☐ Master's degree or equivalent ☐ Doctoral degree or equivalent
1.2. Contact details
Name of public entity:
Department/office name:
Website:
Email address:
Phone:
Mobile phone:
1.3. Agency Address
Street:
City:
Postal code:
Region:
Country:
1.4. Additional contributor(s) to the questionnaire
This information is for validation purposes only. It will not be publicly released.
a. Title (e.g., Mr., Ms., Dr.):
Name:
Agency:
Job title:
Email:
Phone:
Address:
b. Title (e.g., Mr., Ms., Dr.):
Name:
Agency:
Job title:
Email:
Phone:
Address:
1.5. Explain the agency's overall mandate (for example, the provision of education, health, employment, or social services). Describe any tangible services provided to citizens. This could be, for example, the delivery of residency cards, social insurance registration, professional/vocational training and life-long learning, or support for job seekers.

Section 2: Access

2.1. Providing clear contact information
2.1.1. How do you communicate the agency's mission to users? Check all that apply.
☐ Website ☐ Social media ☐ Display boards or billboards ☐ Magazine or newspaper advertisements ☐ Printed brochures
2.1.2. How do you communicate the agency's contact information to users? Check all that apply.
☐ Website ☐ Social media ☐ Display boards or billboards ☐ Magazine or newspaper advertisements ☐ Printed brochures
If the agency does not have a website, skip to section 2.2. If the agency does have a website, answer questions 2.1.3–2.1.17.
Does the agency website allow users to identify the following in two or fewer clicks?
2.1.3. The agency's mission and responsibilities in terms of service delivery
☐ Yes ☐ No Link and/or comments:
2.1.4. General contact information for the agency
☐ Yes ☐ No Link and/or comments:
2.1.5. The responsibilities of specific departments in terms of service delivery
☐ Yes ☐ No Link and/or comments:
2.1.6. Contact information for specific departments and officials
☐ Yes ☐ No Link and/or comments:
2.1.7. An overall organizational structure and chart that includes the names of units and responsible persons
☐ Yes ☐ No Link and/or comments:
If citizens want to contact the agency regarding the delivery of a specific service, can they do so through the following means?
2.1.8. An online form on the agency's website
☐ Yes ☐ No Link and/or comments:
2.1.9. A generic email address
☐ Yes ☐ No Link and/or comments:
2.1.10. A specific email address that will put the citizen in direct contact with the responsible division or department
☐ Yes ☐ No Link and/or comments:
2.1.11. A generic phone number
☐ Yes ☐ No Link and/or comments:
2.1.12. A specific phone number that will put the citizen in direct contact with the responsible division or department
☐ Yes ☐ No Link and/or comments:
2.1.13. A clearly identified person, including name, position, and division or department
☐ Yes ☐ No Link and/or comments:
2.1.14. Online virtual assistance
☐ Yes ☐ No Link and/or comments:
2.1.15. Online chat functionality with an actual person
☐ Yes ☐ No Link and/or comments:
2.1.16. Does the agency's website include links to other organizations along with an explanation as to why a citizen might want to contact them?
☐ Yes ☐ No Link and/or comments:
2.1.17. Does the agency's website include a search function?
☐ Yes ☐ No Link and/or comments:
See section 2.1 of the citizen survey to compare this self-assessment with the views expressed by citizens.
2.2. Access channels and citizens' preferences
2.2.1. Which of the following access channels can citizens use to contact the agency? Check all that apply.
☐ In-person, face-to-face interaction at a physical facility
☐ Posted letter and/or facsimile
☐ Telephone
☐ Email
☐ Online form on agency website
☐ Tablet/smartphone application
☐ Social media
2.2.2. Have you asked citizens which access channels they prefer using?
☐ Yes ☐ No Comments:
See question 2.2.2 in the citizen survey to compare currently available access channels with the preferences of citizens.
2.2.3. Is data about access channels used by citizens collected systematically?
☐ Yes ☐ No Comments:
2.2.4. If yes, in the past 12 months, how many citizens have contacted the agency using the following channels?
In-person, face-to-face:
Posted letter and/or facsimile:
Telephone (fixed line or mobile):
Email:
Online form on the website:
Tablet/smartphone applications:
Social media:
Other (specify):
Total:
2.3. Interacting with citizens
How would you evaluate the agency in terms of the following?
2.3.1. The ease with which citizens can contact the agency
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Comments:
2.3.2. Overall waiting times at the facility, with postal delivery, on the phone, or by email
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Comments:
2.3.3. Number of public servants with which citizens must interact to resolve issues
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Comments:
2.3.4. User-friendly operating hours (such as lunchtime and evening hours that facilitate access for citizens working full-time)
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Comments:
2.3.5. The ease with which citizens can get to the facility (such as ease of access with public transport)
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Comments:
2.3.6. User-friendly physical layout of facility (such as a clearly identifiable reception area, waiting areas with comfortable seating, and easy-access ramps for people with disabilities or parents with strollers)
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Comments:
See questions 2.3.1–2.3.6 of the citizen survey to compare this self-assessment with the views expressed by citizens.
2.3.7. Are the agency's services tailored to people with special needs, including people with disabilities, the elderly, people living in remote areas, and people from lower socioeconomic backgrounds, among others?
☐ Yes ☐ No
2.3.8. If yes, how are services tailored to particular populations or groups (for example, special accessibility mechanisms for the visually or physically impaired, wheelchair-accessible design of facility, mobile service centers that bring services to remote segments of the population, or special efforts to facilitate service delivery to citizens with low literacy levels)?
2.3.9. Has the agency ever conducted accessibility testing of its services (to assess how easily users with various disabilities are able to access services, and then used this information to improve service design and implementation)?
☐ Yes ☐ No If yes, elaborate:
2.3.10. Is the agency's paper documentation available in languages relevant to all population segments (such as other national languages or English if foreigners are likely to use its services)?
☐ Yes, fully available in more than one language ☐ Yes, partially available in more than one language ☐ Not available in other languages
If yes, elaborate:
2.3.11. Is the agency's online documentation available in languages relevant to all population segments (such as other national languages or English if foreigners are likely to use its services)?
☐ Yes, fully available in more than one language ☐ Yes, partially available in more than one language ☐ Not available in other languages
If yes, elaborate:
2.4. E-government services/digital procedures
Online service delivery
2.4.1. How many agency services are partially or fully provided online?
List services partially provided online:
List services fully provided online:
Note: In the case of a company registering its name, the service would be considered fully available online if the registration and administration approval processes are both possible online—without any paper or in-person visit by the entrepreneur required (European Commission 2012: 83).
If the agency is unable to complete partial or full transactions online, skip to question 2.4.12.
Does the agency use any of the following e-government features identified by the European Commission (2014) as key enablers for public services?
2.4.2. Electronic identification. Can citizens use a government-issued electronic form of identification and authentication for the process?
☐ Yes ☐ No Comments:
2.4.3. Single sign-on. Can users access multiple systems without logging on multiple times?
☐ Yes ☐ No Comments:
2.4.4. Electronic documents. Are authenticated documents that are recognized by the public administration being used to allow users to send and receive documents online, for example, by e-signature?
☐ Yes ☐ No Comments:
2.4.5. Authentic sources. Are base registries used to automatically validate or fetch data related to citizens or businesses, allowing online forms to be prefilled so they are received by the user either partly or fully completed?
☐ Yes ☐ No Comments:
2.4.6. Electronic safe (e-safe). Is there a virtual and secure repository for citizens to store and retrieve personal electronic data and documents?
☐ Yes ☐ No
2.4.7. Are online services presented according to various citizen categories, such as student, entrepreneur, employee, and retired?
☐ Yes ☐ No Comments:
2.4.8. Are the online services provided by the agency bundled by life event?
☐ Yes ☐ No Comments:
2.4.9. Is a citizen's progress openly tracked over the course of an online service delivery transaction—that is, is it made clear how many of the process steps the citizen has already accomplished and how many still remain to be done?
☐ Yes ☐ No Comments:
2.4.10. Can users save their work as a draft over the course of an online service delivery transaction, that is, can they return to the draft at a later time?
☐ Yes ☐ No Comments:
2.4.11. Is a demonstration available to help citizens make use of online services while they conduct a transaction, such as a click-through presentation, an online video, or a downloadable manual that explains the necessary steps?
☐ Yes ☐ No Comments:
such as a click-through presentation, an online video, or a r Yes r No downloadable manual that explains the necessary steps? If yes, what is the link? Note: An open data portal is a web-based interface, usually with specific search r Yes r No Comments: functionalities, designed to facilitate database searches. Application program- ming interfaces (APIs) are also often available, offering direct and automated access to data for software applications. V. The Administrator Checklist | 29 2.4.17. Does the agency have a public performance data Quality of website/applications dashboard, that is, an openly accessible, visual display of How would you evaluate the agency’s online interface in its performance data across several key metrics? terms of the following? r Yes r No 2.4.30. Ease of navigation If yes, what is the link? r Poor r Below average r Average r Good r Excellent Collection of relevant metrics r N/A Elaborate: Is the agency collecting the following common baseline 2.4.31. Presentation metrics for the agency’s website? r Poor r Below average r Average r Good r Excellent 2.4.18. Total visits r N/A Elaborate: r Yes r No If yes, previous month’s total: 2.4.32. Ease of downloading material 2.4.19. Total page views r Poor r Below average r Average r Good r Excellent r Yes r No If yes, previous month’s total: r N/A Elaborate: 2.4.20. Unique visitors 2.4.33. Information/documents available r Yes r No If yes, previous month’s total: r Poor r Below average r Average r Good r Excellent r N/A Elaborate: 2.4.21. Page views per visit r Yes r No If yes, previous month’s average: 2.4.34. Clarity of online forms r Poor r Below average r Average r Good r Excellent 2.4.22. Average visit duration r N/A Elaborate: r Yes r No If yes, previous month’s average: 2.4.35. Instructions, support and/or help functionalities 2.4.23. Time on page r Poor r Below average r Average r Good r Excellent r Yes r No If yes, previous month’s average: r N/A Elaborate: 2.4.24. Bounce rate r Yes r No If yes, previous month’s bounce rate: Compare the self-assessment in questions 2.4.30-2.4.35 with the views expressed by citizens in questions 2.4.25. New versus returning visitor 2.4.3–2.49 of the citizen survey. r Yes r No If yes, previous month’s ratio of new to returning visitors: Are the following elements available on the agency’s website? 2.4.26. Visits per visitor in a specified time frame r Yes r No If yes, elaborate: 2.4.36. A page for frequently asked questions r Yes r No Elaborate: 2.4.27. Total number of onsite search queries r Yes r No If yes, previous month’s total: 2.4.37. A live support functionality (click-to-chat) r Yes r No Elaborate: 2.4.28. Visitor composition r Yes r No If yes, elaborate: 2.4.29. Total interactions/connections via social media channels r Yes r No If yes, previous month’s total: 30 | Indicators of Citizen-Centric Public Service Delivery Section 3: User-Centered Service Delivery 3.2. Providing timely service and Responsiveness 3.2.1. List key services provided by the agency, corresponding service standards, and number and type 3.1. Providing a personalized service of supporting documents citizens need to access the 3.1.1. Under certain circumstances, does the agency services. Service standards are specific delivery targets proactively contact citizens to bring specific information or commitments established by the organization that to their attention? 
Section 3: User-Centered Service Delivery and Responsiveness

3.1. Providing a personalized service
3.1.1. Under certain circumstances, does the agency proactively contact citizens to bring specific information to their attention?
☐ Yes ☐ No ☐ Not applicable
If yes:
3.1.2. Explain the circumstances under which the agency proactively contacts citizens (such as registering on the electoral roll; renewing identification documents; submitting income taxes; or receiving benefits in the event of a birth, loss of employment, or health incident).
3.1.3. How does the agency usually contact citizens?
☐ Posted mail ☐ Email ☐ SMS ☐ Telephone
3.1.4. Over the past 12 months, has the agency involved citizens in the design of its services (that is, tapping into the knowledge of service users by providing them with an opportunity to co-create the service delivery process by, for example, inviting citizens to participate in a role-playing activity to test prototypes)?
☐ Yes ☐ No Elaborate:
3.2. Providing timely service
3.2.1. List key services provided by the agency, corresponding service standards, and the number and type of supporting documents citizens need to access the services. Service standards are specific delivery targets or commitments established by the organization that it promises to honor when delivering a service, such as delivery of a document within three days, calls answered in 20 seconds, and 100 percent of citizens' questions addressed.
1. Service:
Service standards:
Supporting documents required:
2. Service:
Service standards:
Supporting documents required:
3. Service:
Service standards:
Supporting documents required:
3.2.2. Are time frames for various services systematically communicated to citizens during interactions/transactions (that is, are citizens clearly informed of how much time it will take to complete the entire process)?
☐ Yes ☐ No Elaborate:
3.2.3. Do citizens receive status updates on the progress of their requests (either offline or online)?
☐ Yes ☐ No Elaborate:
3.2.4. Does the agency collect data on the time required for it to deliver its services to citizens?
☐ Yes ☐ No Elaborate:
3.2.5. If the answer to question 3.2.4 is "yes," indicate the percentage of services delivered within stipulated time frames (for example, 87 percent of identity cards are provided within a 15-day time frame, or 55 percent of health insurance cards are provided within a 7-day time frame).
3.2.6. In the last six months, how many citizens contacted the agency to request a status update on a request?
3.3. Setting service delivery standards in line with expectations
3.3.1. Has the agency consulted with citizens to identify what they view as timely service?
☐ Yes ☐ No Elaborate:
3.3.2. Do the agency's service standards reflect citizens' expectations?
☐ Yes ☐ No Elaborate:
Compare this self-assessment with the views expressed in questions 3.1.1–3.3.7 of the citizen survey.
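The compliance figure requested in question 3.2.5 follows directly from the data collected under question 3.2.4. A minimal sketch, assuming each request is recorded as a pair of request and delivery dates; the 15-day standard and the records themselves are illustrative:

    from datetime import date

    def within_standard(requests, standard_days):
        """Share of requests delivered within the stipulated service standard."""
        on_time = sum(1 for requested, delivered in requests
                      if (delivered - requested).days <= standard_days)
        return round(100 * on_time / len(requests), 1)

    # Hypothetical identity-card requests against a 15-day standard.
    identity_cards = [(date(2018, 1, 2), date(2018, 1, 10)),
                      (date(2018, 1, 3), date(2018, 1, 25)),
                      (date(2018, 1, 8), date(2018, 1, 19))]
    print(within_standard(identity_cards, standard_days=15))   # 66.7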
Section 4: Reliability and Quality of Service Delivery

4.1. Interacting with citizens
Evaluate the agency's citizen-facing staff in terms of the following:
4.1.1. Politeness
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.1.2. Fairness
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.1.3. Helpfulness
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.1.4. Knowledge/competence
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.1.5. Do front-office staff have training opportunities in customer service?
☐ Yes ☐ No Elaborate:
Compare this self-assessment with the views expressed in the questions in section 4.1 of the citizen survey.
4.2. Providing clear, high-quality information
4.2.1. Quality of information and advice provided to citizens
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.2.2. Effectiveness of information delivery
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.2.3. Clarity of language used to provide information and advice (for example, is content conveyed in plain language that citizens find easy to understand?)
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
Compare this self-assessment with the views expressed in the questions in section 4.2 of the citizen survey.
4.3. Completing the procedure
Evaluate the agency's performance in terms of the following:
4.3.1. Value-for-money/cost for services
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.3.2. Paperless procedures
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.3.3. Streamlined internal processes
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.3.4. Ease of processes for citizens
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.3.5. Number of documents citizens must submit
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
4.3.6. Over the last 12 months, has the agency taken any steps toward administrative simplification, such as process reengineering activities?
☐ Yes ☐ No Elaborate:
Compare this self-assessment with the views expressed in the questions in section 4.3 of the citizen survey.
4.4. Reaching a satisfactory outcome for citizens
4.4.1. Is the agency capturing data about citizen satisfaction?
☐ Yes ☐ No
If yes, elaborate (for example, through user surveys, focus groups, or user panels):
4.4.2. Do front-line staff report insights gathered through direct interaction with users for continuous improvement purposes?
☐ Yes ☐ No Elaborate:
4.4.3. Is the agency testing the suitability and strength of its service delivery through mystery shopping, usability testing, and/or customer journey mapping?
☐ Yes ☐ No Elaborate:
Note: Mystery shopping is a technique where trained individuals pretend to be potential customers or service users and report back on their experiences in a detailed and objective way. It differs from other research techniques in that evaluators do not declare themselves to the service provider during the interaction. Usability testing consists of small-scale (3–5 users) or large-scale (20–100 users) qualitative tests for service providers to observe user behavior and ability to complete tasks. It is commonly used to measure metrics such as error rate, number of clicks, and time spent, as well as to collect general feedback on the experience of users. Customer journey mapping provides an overview of the user experience by telling the story of a customer from initial contact, through the process of engagement, and into a long-term relationship. It identifies key interactions between the customer and the organization, and examines the user's feelings, motivations, and questions relating to these touchpoints. It is a useful tool for identifying potential pain points, such as gaps between devices, departments, or channels; and it puts users at the center of the organization's thinking.

Section 5: Public Sector Integrity

5.1. Embodying a transparent, corruption-free, and effective public sector
Does the agency publish any of the following documents online?
5.1.1. Current budget figures
☐ Yes ☐ No Elaborate:
5.1.2. Current budget figures in a clear and understandable way (citizens' budget format)
☐ Yes ☐ No Elaborate:
5.1.3. Past budget figures for the last three years at minimum
☐ Yes ☐ No Elaborate:
5.1.4. Contracts signed with third parties, including names of parties, contract value, subject, date of publishing, and termination
☐ Yes ☐ No Elaborate:
5.1.5. Search tools for contracts (for example, by date and supplier)
☐ Yes ☐ No Elaborate:
5.1.6. Annual report
☐ Yes ☐ No Elaborate:
5.1.7. User fees for each service provided
☐ Yes ☐ No Elaborate:
5.1.8. Job openings
☐ Yes ☐ No Elaborate:
Regarding access-to-information requests:
5.1.9. Is there an established institutional mechanism through which citizens can request the agency's records?
☐ Yes ☐ No Elaborate:
5.1.10. If the answer to question 5.1.9 is "yes," please indicate the legal basis for this.
5.1.11. How many access-to-information requests regarding agency information or records has the agency received over the past 12 months?
5.1.12. How many access-to-information requests were denied over the past 12 months?
Does the agency have any of the following?
5.1.13. An ethics officer
☐ Yes ☐ No Elaborate:
5.1.14. A clear whistleblower protection policy
☐ Yes ☐ No Elaborate:
5.1.15. A code of ethics/conduct for staff
☐ Yes ☐ No Elaborate:
If the answer to question 5.1.15 is "yes," does the code of ethics/conduct address the following?
5.1.16. Conflict of interest resolution
☐ Yes ☐ No Elaborate:
5.1.17. Abuse of public power, information obtained in office, and/or trust of superiors to gain undue advantage
☐ Yes ☐ No Elaborate:
5.1.18. Gifts and benefits
☐ Yes ☐ No Elaborate:
5.1.19. Postemployment behavior and limitations
☐ Yes ☐ No Elaborate:
5.1.20. Code of conduct for public procurement
☐ Yes ☐ No Elaborate:
5.1.21. Sanctions for breach of the code of ethics/conduct
☐ Yes ☐ No Elaborate:
5.1.22. How likely are staff members who are involved in delivering services to accept (or ask for) something in return for carrying out the transaction? List all services below.
Service 1:
☐ Not likely at all ☐ Rather unlikely ☐ Neither likely nor unlikely ☐ Rather likely ☐ Very likely ☐ Do not know
Service 2:
☐ Not likely at all ☐ Rather unlikely ☐ Neither likely nor unlikely ☐ Rather likely ☐ Very likely ☐ Do not know
Service 3:
☐ Not likely at all ☐ Rather unlikely ☐ Neither likely nor unlikely ☐ Rather likely ☐ Very likely ☐ Do not know
How effective are the agency's policies and mechanisms in place to avoid the following:
5.1.23. Favoritism within the organization
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent
5.1.24. Bribes
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent
5.1.25. Flawed public procurement
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent
5.1.26. Discrimination toward users
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent
5.2. Providing feedback and complaint-handling mechanisms
5.2.1. Is there an established institutional mechanism through which citizens can provide feedback about any services received that goes beyond the provision of detailed contact information, such as user satisfaction monitoring, polls, or surveys?
☐ Yes ☐ No Elaborate:
5.2.2. If yes, through which of the following channels can citizens express their views?
☐ Face-to-face Elaborate:
☐ Website Elaborate:
☐ Text message (SMS) Elaborate:
☐ Email Elaborate:
☐ Telephone Elaborate:
☐ Social media Elaborate:
☐ Paper form Elaborate:
5.2.3. Does the agency use social media and other third-party platforms to listen to and serve citizens?
☐ Yes ☐ No Elaborate:
If yes, specify:
5.2.4. Does the agency analyze the citizen feedback it receives?
☐ Yes ☐ No Elaborate:
5.2.5. Does the agency provide citizens with a dedicated way to file complaints about service delivery, such as a hotline or online form to report dissatisfaction or illegal/corrupt practices?
☐ Yes ☐ No Elaborate:
If the answer to question 5.2.5 is "yes":
5.2.6. Are time frames for resolution stipulated?
☐ Yes ☐ No Elaborate:
5.2.7. How many complaints were received over the past 12 months?
5.2.8. How many of these complaints were resolved over the past 12 months?
5.2.9. How many complaints were resolved within the stipulated time frames?
5.3. Improving interagency cooperation
5.3.1. With which other entities, if any, does the agency coordinate to deliver the services for which it is responsible?
5.3.2. How would you evaluate the quality of cooperation between the agency and other involved entities?
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:
5.3.3. Are the existing legislation, memoranda of understanding, and bilateral agreements adequate to foster effective cooperation?
☐ Yes ☐ No Elaborate:
5.3.4. Does the agency share the same business processes as the other involved entities?
☐ Yes ☐ No Elaborate:
5.3.5. Does the agency share the same strategic vision as the other involved entities?
☐ Yes ☐ No Elaborate:
5.3.6. Does poor cooperation sometimes cause delays?
☐ Yes ☐ No Elaborate:
5.3.7. Do technical problems/incompatibilities (such as the use of multiple information technology systems) slow down cooperation?
☐ Yes ☐ No Elaborate:
5.3.8. Is there an interagency management information system that enables coherent data management and avoids replication of data or repeated submission of documents for citizens?
☐ Yes ☐ No Elaborate:

Section 6: Final Comments

6.1. What do you think the agency's priority area should be for improving the delivery of public services? Select one.
☐ Simplify access to services (such as through one-stop shops)
☐ Improve quality of services
☐ Reduce cost of services
☐ Motivate staff
☐ Improve timeliness
☐ Mainstream/improve digital procedures
☐ Improve transparency/reduce corruption and nepotism
☐ Simplify processes (including reduction of red tape and paperwork)
☐ Other (please specify):
Compare this self-assessment with the views expressed in question 6.1 of the citizen survey.
6.2. What support does the agency need to improve the priority area selected in question 6.1?
6.3. Are there any additional comments, suggestions, questions, or concerns you would like to share?

End of checklist.

Appendix A. Administrator Checklist as Filled Out by a Municipal Registry Office: Illustrative Example

Section 1: Contributor Information

1.1. Primary contributor information
This information is for validation purposes only. It will not be publicly released.
Title (e.g., Mr., Ms., Dr.): Ms.
First Name: Imaginary
Last Name: Magistrate
Job title: Registry Officer
Highest level of educational attainment:
☐ Primary education ☐ Secondary education ☐ Short-cycle tertiary ☐ Bachelor's degree or equivalent ☑ Master's degree or equivalent ☐ Doctoral degree or equivalent
1.2. Contact details
Name of public entity: Municipal government
Department/office name: Registry Office
Website: www.oldtown.org
Email address: imaginary.magistrate@municipality.org
Phone: +123456789
Mobile phone:
1.3. Agency Address
Street: 10 Old Town Street
City: Capital City
Postal code: 12345
Region: Main region
Country: Illustrative
1.4. Additional contributor(s) to the questionnaire
This information is for validation purposes only. It will not be publicly released.
a. Title (e.g., Mr., Ms., Dr.):
Name:
Agency:
Job title:
Email:
Phone:
Address:
b. Title (e.g., Mr., Ms., Dr.):
Name:
Agency:
Job title:
Email:
Phone:
Address:
1.5. Explain the agency's overall mandate (for example, the provision of education, health, employment, or social services). Describe any tangible services provided to citizens, such as residency cards, social insurance registration, professional/vocational training and life-long learning, and support for job seekers.
Our agency provides administrative services to citizens. Specifically, we provide them with marriage, birth, and death certificates.

Section 2: Access

2.1. Providing clear contact information

2.1.1. How do you communicate the agency's mission to users? Check all that apply.
☑ Website ☐ Social media ☐ Display boards or billboards ☐ Magazine or newspaper advertisements ☐ Printed brochures

2.1.2. How do you communicate the agency's contact information to users? Check all that apply.
☑ Website ☐ Social media ☐ Display boards or billboards ☐ Magazine or newspaper advertisements ☐ Printed brochures

If the agency does not have a website, skip to section 2.2. If the agency does have a website, answer questions 2.1.3–2.1.17.

Does the agency website allow users to identify the following in two or fewer clicks?

2.1.3. The agency's mission and responsibilities in terms of service delivery
☑ Yes ☐ No Link and/or comments: www.oldtown.org/registryoffice

2.1.4. General contact information for the agency
☑ Yes ☐ No Link and/or comments:

2.1.5. The responsibilities of specific departments in terms of service delivery
☑ Yes ☐ No Link and/or comments:

2.1.6. Contact information for specific departments and officials
☑ Yes ☐ No Link and/or comments:

2.1.7. An overall organizational structure and chart that includes the names of units and responsible persons
☑ Yes ☐ No Link and/or comments:

If citizens want to contact the agency regarding the delivery of a specific service, can they do so through the following means?

2.1.8. An online form on the agency's website
☐ Yes ☑ No Link and/or comments:

2.1.9. A generic email address
☑ Yes ☐ No Link and/or comments:

2.1.10. A specific email address that will put the citizen in direct contact with the responsible division or department
☑ Yes ☐ No Link and/or comments:

2.1.11. A generic phone number
☑ Yes ☐ No Link and/or comments:

2.1.12. A specific phone number that will put the citizen in direct contact with the responsible division or department
☑ Yes ☐ No Link and/or comments:

2.1.13. A clearly identified person, including name, position, and division or department
☑ Yes ☐ No Link and/or comments:

2.1.14. Online virtual assistance
☐ Yes ☑ No Link and/or comments:

2.1.15. Online chat functionality with an actual person
☐ Yes ☑ No Link and/or comments:

2.1.16. Does the agency's website include links to other organizations along with an explanation as to why a citizen might want to contact them?
☑ Yes ☐ No Link and/or comments: Only partly. Our website includes links to the Ministry of Interior, but it could include more (e.g., archives, immigration police, department of labor). For now, advice on where else citizens can go is given only face-to-face.

2.1.17. Does the agency's website include a search function?
☑ Yes ☐ No Link and/or comments:

See section 2.1 of the citizen survey to compare this self-assessment with the views expressed by citizens.

2.2. Access channels and citizens' preferences

2.2.1. Which of the following access channels can citizens use to contact the agency? Check all that apply.
☑ In-person, face-to-face interaction at a physical facility
☑ Posted letter and/or facsimile
☑ Telephone
☑ Email
☑ Online form on agency website
☐ Tablet/smartphone application
☐ Social media

2.2.2. Have you asked citizens which access channels they prefer using?
☐ Yes ☑ No Comments:

2.2.3. Is data about access channels used by citizens collected systematically?
☐ Yes ☑ No Comments:

2.2.4. If yes, in the past 12 months, how many citizens have contacted the agency using the following channels?
In-person, face-to-face: 6,500
Posted letter and/or facsimile: 300
Telephone (fixed line or mobile): 18,000
Email: 3,600
Online form on the website: x
Tablet/smartphone applications: x
Social media: x
Other (specify): x
Total: 28,400

See question 2.2.2 in the citizen survey to compare currently available access channels with the preferences of citizens.
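Note: Where volumes such as those in question 2.2.4 are tracked, they translate directly into a channel mix. A minimal sketch in Python using the illustrative counts above (the code is not part of the instrument; percentages are rounded):

# Contact volumes from question 2.2.4 of the illustrative example.
channels = {
    "In-person, face-to-face": 6500,
    "Posted letter and/or facsimile": 300,
    "Telephone (fixed line or mobile)": 18000,
    "Email": 3600,
}

total = sum(channels.values())  # 28,400, matching the reported total
for name, count in channels.items():
    print(f"{name}: {count / total:.1%}")
# Telephone dominates at roughly 63 percent of contacts, which suggests
# where waiting-time and quality monitoring matter most.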
2.3. Interacting with citizens

How would you evaluate the agency in terms of the following?

2.3.1. The ease with which citizens can contact the agency
☐ Poor ☐ Below average ☑ Average ☐ Good ☐ Excellent Comments:

2.3.2. Overall waiting times at the facility, with postal delivery, on the phone, or by email
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent Comments:

2.3.3. Number of public servants with which citizens must interact to resolve issues
☐ Poor ☐ Below average ☑ Average ☐ Good ☐ Excellent Comments:

2.3.4. User-friendly operating hours (such as lunchtime and evening hours that facilitate access for citizens working full-time)
☐ Poor ☑ Below average ☐ Average ☐ Good ☐ Excellent Comments: Opening times are Mondays and Wednesdays 8:00 a.m. to 5:00 p.m., with a break from 12:00 p.m. to 12:30 p.m.

2.3.5. The ease with which citizens can get to the facility (such as ease of access with public transport)
☐ Poor ☐ Below average ☑ Average ☐ Good ☐ Excellent Comments:

2.3.6. User-friendly physical layout of facility, such as a clearly identifiable reception area, waiting areas with comfortable seating, and easy-access ramps for people with disabilities or parents with strollers
☐ Poor ☐ Below average ☑ Average ☐ Good ☐ Excellent Comments:

See questions 2.3.1–2.3.6 of the citizen survey to compare this self-assessment with the views expressed by citizens.

2.3.7. Are the agency's services tailored to people with special needs, including people with disabilities, the elderly, people living in remote areas, and people from lower socioeconomic backgrounds, among others?
☐ Yes ☑ No

2.3.8. If yes, how are services tailored to particular populations or groups (for example, special accessibility mechanisms for the visually or physically impaired, wheelchair-accessible design of facility, mobile service centers that bring services to remote segments of the population, or special efforts to facilitate service delivery to citizens with low literacy levels)?

2.3.9. Has the agency ever conducted accessibility testing of its services to assess how easily users with various disabilities are able to access services, and then used this information to improve service design and implementation?
☐ Yes ☑ No If yes, elaborate:

2.3.10. Is the agency's paper documentation available in languages relevant to all population segments (such as other national languages or English if foreigners are likely to use its services)?
☐ Yes, fully available in more than one language
☑ Yes, partially available in more than one language
☐ Not available in other languages
If yes, elaborate: A few years ago, we developed a multilingual document in partnership with the International Organization for Migration that explained procedures in Vietnamese, Chinese, English, Russian, and Arabic.

2.3.11. Is the agency's online documentation available in languages relevant to all population segments (such as other national languages or English if foreigners are likely to use its services)?
☐ Yes, fully available in more than one language
☐ Yes, partially available in more than one language
☑ Not available in other languages
If yes, elaborate: We can refer users to the website of the migration information center, which contains information about many common situations in English.
2.4. E-government services/digital procedures

Online service delivery

2.4.1. How many agency services are partially or fully provided online? 3
List services partially provided online:
List services fully provided online: Citizens can request and receive duplicates of (1) birth, (2) marriage, and (3) death certificates online. However, this service was established only one year ago, and it is sometimes still faster to receive duplicates in person than online.

Note: In the case of a company registering its name, the service would be considered fully available online if the registration and administration approval processes are both possible online—without any paper or in-person visit by the entrepreneur required (European Commission 2012: 83).

If the agency is unable to complete partial or full transactions online, skip to question 2.4.12.

Does the agency use any of the following e-government features identified by the European Commission (2014) as key enablers for public services?

2.4.2. Electronic identification. Can citizens use a government-issued electronic form of identification and authentication for the process?
☑ Yes ☐ No Comments: They can use the latest form of citizen identification card issued, which contains an electronic chip.

2.4.3. Single sign-on. Can users access multiple systems without logging on multiple times?
☑ Yes ☐ No Comments: Citizens can use the national e-government platform.

2.4.4. Electronic documents. Are authenticated documents that are recognized by the public administration being used to allow users to send and receive documents online, for example, by e-signature?
☐ Yes ☑ No Comments:

2.4.5. Authentic sources. Are base registries used to automatically validate or fetch data related to citizens or businesses, allowing online forms to be prefilled so they are received by the user either partly or fully completed?
☐ Yes ☑ No Comments:

2.4.6. Electronic safe (e-safe). Is there a virtual and secure repository for citizens to store and retrieve personal electronic data and documents?
☐ Yes ☑ No

The registry office itself does not have the responsibility for the national e-government platform that delivers the duplicates. Therefore, questions 2.4.7–2.4.11 on the quality of online service delivery and privacy and identity management do not apply. They should be raised with the ministry of interior, which is in charge of e-government services, and they have therefore been deleted from this self-assessment. The registry office did receive approximately 4,500 requests for duplicates in the last year, and only 3 of them were submitted online.

Open data

2.4.12. Does the agency have an open data portal?
☐ Yes ☑ No
If yes, what is the link?
If no, is the agency providing at least some datasets to the public in their entirety through bulk downloads and application programming interfaces (APIs)?
☐ Yes ☑ No
If yes, what is the link? While we do not have an open data portal, the statistics office publishes the number of services offered by the registry office on a regular basis. In addition, the registry office sends an annual report to the Ministry of Interior that includes data about the year's activities.

Note: An open data portal is a web-based interface, usually with specific search functionalities, designed to facilitate database searches. Application programming interfaces (APIs) are also often available, offering direct and automated access to data to software applications.

2.4.13. Does the agency have a public performance data dashboard, that is, an openly accessible, visual display of its performance data across several key metrics?
☐ Yes ☑ No
If yes, what is the link?

Collection of relevant metrics

Is the agency collecting the following common baseline metrics for the agency's website?

2.4.14. Total visits
☐ Yes ☐ No If yes, previous month's total:
2.4.15. Total page views
☐ Yes ☐ No If yes, previous month's total:
2.4.16. Unique visitors
☐ Yes ☐ No If yes, previous month's total:
2.4.17. Page views per visit
☐ Yes ☐ No If yes, previous month's average:
2.4.18. Average visit duration
☐ Yes ☐ No If yes, previous month's average:
2.4.19. Time on page
☐ Yes ☐ No If yes, previous month's average:
2.4.20. Bounce rate
☐ Yes ☐ No If yes, previous month's bounce rate:
2.4.21. New versus returning visitor
☐ Yes ☐ No If yes, previous month's ratio of new to returning visitors:
2.4.22. Visits per visitor in a specified time frame
☐ Yes ☐ No If yes, elaborate:
2.4.23. Total number of onsite search queries
☐ Yes ☐ No If yes, previous month's total:
2.4.24. Visitor composition
☐ Yes ☐ No If yes, elaborate:
2.4.25. Total interactions/connections via social media channels
☐ Yes ☐ No If yes, previous month's total:

These metrics can be accessed by the IT team, but they are currently not being used for analysis.
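Note: Most of the baseline metrics in questions 2.4.14–2.4.25 can be derived from a single visit-level log rather than compiled by hand. A minimal sketch in Python over a hypothetical log (the record structure and figures are assumptions for illustration, not a description of any agency system):

# Hypothetical per-visit records: (visitor_id, pages_viewed, duration_seconds).
visits = [
    ("a", 1, 20), ("b", 4, 310), ("a", 2, 95), ("c", 1, 10), ("d", 6, 420),
]

total_visits = len(visits)                        # 2.4.14
total_page_views = sum(p for _, p, _ in visits)   # 2.4.15
unique_visitors = len({v for v, _, _ in visits})  # 2.4.16
pages_per_visit = total_page_views / total_visits # 2.4.17
avg_duration = sum(d for _, _, d in visits) / total_visits           # 2.4.18
bounce_rate = sum(1 for _, p, _ in visits if p == 1) / total_visits  # 2.4.20

print(total_visits, total_page_views, unique_visitors)
print(f"pages/visit={pages_per_visit:.1f}, "
      f"avg duration={avg_duration:.0f}s, bounce rate={bounce_rate:.0%}")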
Quality of website/applications

How would you evaluate the agency's online interface in terms of the following?

2.4.26. Ease of navigation
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent ☐ N/A Elaborate:

2.4.27. Presentation
☐ Poor ☐ Below average ☑ Average ☐ Good ☐ Excellent ☐ N/A Elaborate: While our website contains all the necessary information, its design is a bit outdated.

2.4.28. Ease of downloading material
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent ☑ N/A Elaborate:

2.4.29. Information/documents available
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent ☐ N/A Elaborate:

2.4.30. Clarity of online forms
☐ Poor ☐ Below average ☐ Average ☐ Good ☐ Excellent ☑ N/A Elaborate:

2.4.31. Instructions, support, and/or help functionalities
☐ Poor ☐ Below average ☑ Average ☐ Good ☐ Excellent ☐ N/A Elaborate:

Compare the self-assessment in questions 2.4.30–2.4.35 with the views expressed by citizens in questions 2.4.3–2.4.9 of the citizen survey.

Are the following elements available on the agency's website?

2.4.32. A page for frequently asked questions
☐ Yes ☑ No Elaborate:

2.4.33. A live support functionality (click-to-chat)
☐ Yes ☑ No Elaborate: However, citizens with unanswered questions can "ask the mayor" on the same website. The municipality also has a Facebook page with information for the public.

Section 3: User-Centered Service Delivery and Responsiveness

3.1. Providing a personalized service

3.1.1. Under certain circumstances, does the agency proactively contact citizens to bring specific information to their attention?
☐ Yes ☐ No ☑ Not applicable

If yes:

3.1.2. Explain the circumstances under which the agency proactively contacts citizens (such as registering on the electoral roll; renewing identification documents; submitting income taxes; or receiving benefits in the event of a birth, loss of employment, or health incident).

3.1.3. How does the agency usually contact citizens?
☐ Posted mail ☐ Email ☐ SMS ☐ Telephone

3.1.4. Over the past 12 months, has the agency involved citizens in the design of its services (that is, tapping into the knowledge of service users by providing them with an opportunity to co-create the service delivery process by, for example, inviting citizens to participate in a role-playing activity to test prototypes)?
☐ Yes ☑ No Elaborate: Because the process is set by the law.
3.2. Providing timely service

3.2.1. List key services provided by the agency, corresponding service standards, and the number and type of supporting documents citizens need to access the services. Service standards are specific delivery targets or commitments established by the organization that it promises to honor when delivering a service, such as delivery of a document within three days, calls answered in 20 seconds, or 100 percent of citizens' questions addressed.

1. Service: Wedding certificate
Service standards: Delivered immediately (under 30 minutes)
Supporting documents required: Identification card and birth certificate; in special cases, certificates of past divorces and death certificate for widows

2. Service: Birth certificate
Service standards: Delivered immediately (under 30 minutes)
Supporting documents required: Identification card of parents and wedding certificate or declaration of parenthood

3. Service: Death certificate
Service standards: Delivered immediately (under 30 minutes)
Supporting documents required: Doctor's certificate, identification card
3.2.2. Are time frames for various services systematically communicated to citizens during interactions/transactions (that is, are citizens clearly informed of how much time it will take to complete the entire process)?
☑ Yes ☐ No Elaborate:

3.2.3. Do citizens receive status updates on the progress of their requests (either offline or online)?
☐ Yes ☐ No Elaborate: Not applicable

3.2.4. Does the agency collect data on the time required for it to deliver its services to citizens?
☐ Yes ☑ No

3.2.5. If the answer to question 3.2.4 is "yes," indicate the percentage of services delivered within stipulated time frames (for example, 87 percent of identity cards are provided within a 15-day time frame, or 55 percent of health insurance cards are provided within a 7-day time frame).
All documents are delivered immediately, assuming the citizen has brought the correct documentation. Waiting times can vary seasonally but are always under 1.5 hours.

3.2.6. In the last six months, how many citizens contacted the agency to request a status update on a request? Not applicable

3.3. Expected service delivery standards

3.3.1. Has the agency consulted with citizens to identify what they view as timely service?
☐ Yes ☑ No Elaborate:

3.3.2. Do the agency's service standards reflect citizens' expectations?
☑ Yes ☐ No Elaborate:

Compare this self-assessment with the views expressed in questions 3.1.1–3.3.7 of the citizen survey.
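Note: Where delivery times are logged (question 3.2.4), the percentage requested in question 3.2.5 is a short computation. A minimal sketch in Python, with hypothetical delivery times measured against a hypothetical 15-day standard:

# Hypothetical delivery times, in days, for one service over a reporting period.
delivery_days = [3, 7, 14, 15, 16, 21, 2, 9, 12, 30]
standard_days = 15  # stipulated time frame for the service

within = sum(1 for d in delivery_days if d <= standard_days)
share_within = within / len(delivery_days)

# e.g., "70 percent of documents were provided within a 15-day time frame"
print(f"{share_within:.0%} of documents delivered within {standard_days} days")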
Section 4: Reliability and Quality of Service Delivery

4.1. Interacting with citizens

Evaluate the agency's citizen-facing staff in terms of the following:

4.1.1. Politeness
☐ Poor ☐ Below average ☐ Average ☐ Good ☑ Excellent Elaborate:

4.1.2. Fairness
☐ Poor ☐ Below average ☐ Average ☐ Good ☑ Excellent Elaborate:

4.1.3. Helpfulness
☐ Poor ☐ Below average ☐ Average ☐ Good ☑ Excellent Elaborate:

4.1.4. Knowledge/competence
☐ Poor ☐ Below average ☐ Average ☐ Good ☑ Excellent Elaborate:

Compare this self-assessment with the views expressed in the questions in section 4.1 of the citizen survey.

4.1.5. Do front-office staff have training opportunities in customer service?
☑ Yes ☐ No Elaborate: These do not take place regularly, but staff have had the opportunity to take part in a one-off training for all municipal employees, which included team-building exercises, coaching on presentation skills, and conflict and crisis management, and which mixed different departments, including social affairs and information technology. We think such trainings could be beneficial every two years, notably to improve cross-departmental cooperation.

4.2. Providing clear, high-quality information

4.2.1. Quality of information and advice provided to citizens
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent Elaborate:

4.2.2. Effectiveness of information delivery process
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent Elaborate:

4.2.3. Clarity of language used to provide information and advice (for example, is content conveyed in plain language that citizens find easy to understand?)
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent Elaborate:

Compare this self-assessment with the views expressed in the questions in section 4.2 of the citizen survey.

4.3. Completing the procedure

Evaluate the agency's performance in terms of the following:

4.3.1. Value-for-money/cost for services
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent Elaborate:

4.3.2. Paperless procedures
☐ Poor ☑ Below average ☐ Average ☐ Good ☐ Excellent Elaborate:

4.3.3. Streamlined internal processes
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent Elaborate:

4.3.4. Ease of processes for citizens
☐ Poor ☐ Below average ☐ Average ☐ Good ☑ Excellent Elaborate:

4.3.5. Number of documents citizens must submit
☐ Poor ☐ Below average ☑ Average ☐ Good ☐ Excellent Elaborate:

Compare this self-assessment with the views expressed in the questions in section 4.3 of the citizen survey.

4.3.6. Over the last 12 months, has the agency taken any steps toward administrative simplification, such as reengineering activities?
☑ Yes ☐ No Elaborate: The online delivery of duplicates was introduced.

4.4. Reaching a satisfactory outcome for citizens

4.4.1. Is the agency capturing data about citizen satisfaction?
☑ Yes ☐ No If yes, elaborate (for example, through user surveys, focus groups, or user panels): Paper-based feedback forms can be completed at the information desk. This takes place in the framework of a survey of the whole municipality evaluating overall citizen satisfaction.

4.4.2. Do front-line staff report insights gathered through direct interaction with users for continuous improvement purposes?
☑ Yes ☐ No Elaborate: During weekly meetings of the whole team (staff of six), suggestions can be brought forward to improve processes (e.g., handling complex cases or changes to the legal framework).

4.4.3. Is the agency testing the suitability and strength of its service delivery through mystery shopping, usability testing, and/or customer journey mapping?
☑ Yes ☐ No Elaborate: Every four years, the municipality uses mystery shopping to measure the quality of its service delivery, including with the registry office.

Note: Mystery shopping is a technique in which trained individuals pretend to be potential customers or service users and report back on their experiences in a detailed and objective way. It differs from other research techniques in that evaluators do not declare themselves to the service provider during the interaction.

Usability testing involves small-scale (3–5 users) or large-scale (20–100 users) qualitative tests that allow service providers to observe user behavior and ability to complete tasks. It is commonly used to measure metrics such as error rate, number of clicks, and time spent, as well as to collect general feedback on the experience of users.

Customer journey mapping provides an overview of the user experience by telling the story of a customer from initial contact, through the process of engagement, and into a long-term relationship. It identifies key interactions between the customer and the organization and examines the user's feelings, motivations, and questions relating to these touchpoints. It is a useful tool for identifying potential pain points, such as gaps between devices, departments, or channels, and it puts users at the center of the organization's thinking.

Section 5: Public Sector Integrity

5.1. Embodying a transparent, corruption-free, and effective public sector

Questions 5.1.1 to 5.1.21, regarding the publication of budget figures, contracts, access-to-information requests, and the code of ethics, are not specifically relevant to the registry office but instead to the whole municipality. The same applies to the question on flawed public procurement. These questions have therefore not been included in this self-assessment.

5.1.22. How likely are staff members who are involved in delivering services to accept (or ask for) something in return for carrying out the transaction? List all services below.

Service 1: Delivery of birth certificate
☑ Not likely at all ☐ Rather unlikely ☐ Neither likely nor unlikely ☐ Rather likely ☐ Very likely ☐ Do not know

Service 2: Delivery of marriage certificate
☑ Not likely at all ☐ Rather unlikely ☐ Neither likely nor unlikely ☐ Rather likely ☐ Very likely ☐ Do not know

Service 3: Delivery of death certificate
☑ Not likely at all ☐ Rather unlikely ☐ Neither likely nor unlikely ☐ Rather likely ☐ Very likely ☐ Do not know

How effective are the agency's policies and mechanisms in place to avoid the following?

5.1.23. Favoritism within the organization
☐ Poor ☐ Below average ☐ Average ☐ Good ☑ Excellent

5.1.24. Bribes
☐ Poor ☐ Below average ☐ Average ☐ Good ☑ Excellent

5.1.25. Flawed public procurement
☐ Poor ☐ Below average ☑ Average ☐ Good ☐ Excellent
5.1.26. Discrimination toward users
☐ Poor ☐ Below average ☐ Average ☐ Good ☑ Excellent

5.2. Providing feedback and complaint-handling mechanisms

5.2.1. Is there an established institutional mechanism through which citizens can provide feedback about any services received that goes beyond the provision of detailed contact information, such as user satisfaction monitoring, polls, or surveys?
☑ Yes ☐ No Elaborate: Customer satisfaction is collected through the periodical paper-based surveys mentioned earlier.

5.2.2. If yes, through which of the following channels can citizens express their views?
☐ Face-to-face Elaborate:
☑ Website Elaborate: The municipal website allows citizens to get in touch with the mayor for comments, complaints, and suggestions.
☐ Text message (SMS) Elaborate:
☑ Email Elaborate:
☐ Telephone Elaborate:
☐ Social media Elaborate:
☑ Paper form Elaborate: Only periodically

5.2.3. Does the agency use social media and other third-party platforms to listen to and serve citizens?
☐ Yes ☑ No Elaborate:
If yes, specify:

5.2.4. Does the agency analyze the citizen feedback it receives?
☑ Yes ☐ No Elaborate:

5.2.5. Does the agency provide citizens with a dedicated way to file complaints about service delivery, such as a hotline or online form to report dissatisfaction or illegal/corrupt practices?
☑ Yes ☐ No Elaborate: Citizens can email the mayor directly.

Questions 5.2.6–5.2.9 are deleted because the complaint-handling mechanism goes through the mayor's office.

5.3. Improving interagency cooperation

5.3.1. With which other entities, if any, does the agency coordinate to deliver the services for which it is responsible? Primarily the social and culture departments

5.3.2. How would you evaluate the quality of cooperation between the agency and other involved entities?
☐ Poor ☐ Below average ☐ Average ☑ Good ☐ Excellent

5.3.3. Are the existing legislation, memoranda of understanding, and bilateral agreements adequate to foster effective cooperation?
☑ Yes ☐ No Elaborate:

5.3.4. Does the agency share the same business processes as the other involved entities?
☑ Yes ☐ No Elaborate:

5.3.5. Does the agency share the same strategic vision as the other involved entities?
☐ Yes ☑ No Elaborate:

5.3.6. Does poor cooperation sometimes cause delays?
☑ Yes ☐ No Elaborate:

5.3.7. Do technical problems/incompatibilities (such as the use of multiple information technology systems) slow down cooperation?
☐ Yes ☑ No Elaborate:

5.3.8. Is there an interagency management information system that enables coherent data management and avoids replication of data or repeated submission of documents for citizens?
☐ Yes ☑ No Elaborate: Due to the confidential nature of the information that the registry office deals with, this data cannot be shared.

Section 6: Final Comments

6.1. What do you think the agency's priority area should be for improving the delivery of public services? Select one.
☐ Simplify access to services (such as through one-stop shops)
☐ Improve quality of services
☐ Reduce cost of services
☐ Motivate staff
☐ Improve timeliness
☑ Mainstream/improve digital procedures
☐ Improve transparency/reduce corruption and nepotism
☐ Simplify processes (including reduction of red tape and paperwork)
☐ Other (please specify):

Compare this self-assessment with the views expressed in question 6.1 of the citizen survey.

6.2. What support does the agency need to improve the priority area selected in question 6.1?
To improve digital procedures, cooperation with the ministry of interior should be improved. Indeed, the registry office's requirements are not necessarily taken into consideration.
Two examples: the paper form for wedding requests could be digitalized, and the online forms could be prefilled once the person's social security number is entered. This would save time and simplify the process for citizens, who would have less paperwork to fill out manually. However, because the e-government portal is managed by an external agency, each required change involves a money and time commitment that the ministry of interior may not be willing to make.

6.3. Are there any additional comments, suggestions, questions, or concerns you would like to share?
See summary of insights on following page.

End of checklist.

Summary of insights gathered through the self-assessment checklist

Based on this self-assessment of the registry office, the following conclusions can be drawn:

Access
• The registry office's website provides all the necessary information for citizens to find out whom to get in touch with. The level of detail of the information provided seems adequate.
• The website may benefit from an improvement in terms of presentation.
• The website could include more links and advice to citizens regarding where to turn for various enquiries related to life events that the registry office handles (births, weddings, and deaths).
• The website does not provide an equivalent level of information for nonnative speakers and could benefit from a more developed translation into English to facilitate access to information for foreigners.
• Opening hours (Mondays and Wednesdays 8:00 a.m. to 5:00 p.m.) are below average and could be extended to facilitate access for working citizens.
• Access for people with disabilities could be improved.
• The registry office could communicate its activities and results more clearly to citizens.

User-centered service delivery
• The registry office seems to be performing well in terms of timeliness.
• The registry office does not seem to impose too much of a burden on citizens in terms of the documentation to provide for the delivery of certificates.

Reliability/quality of service delivery
• Levels of customer service seem to be high (self-assessed as excellent).
• Paperless procedures have been scored as below average and could thus be improved.
• There seem to be opportunities to offer more trainings for staff.
• The registry office appears to make effective and regular use of mystery shopping techniques.

Public sector integrity
• The registry office is conducting irregular citizen satisfaction surveys in paper form as part of an evaluation exercise conducted on behalf of the whole municipality.
• Overall, there do not seem to be major issues with corruption, bribes, or favoritism.
• Cooperation with other services seems good, but communication channels with the ministry of interior could be strengthened.

Based on these findings, the registry office could consider:
• Adding some questions from the citizen survey template to the periodical satisfaction surveys it conducts in paper form.
• Conducting a quick one-off or an ongoing/rolling survey of citizens immediately after they receive the certificate they were seeking, to examine whether citizens' opinions confirm impressions from the self-assessment.
• Adding some questions to the citizen survey to evaluate the perceived burden of paper procedures and the reasons for the low uptake of e-government procedures.

Concretely, the registry office could, for example, decide to administer the following shortened citizen survey, which consists of a selection of 30 questions deemed most relevant for its services.
Some sections (e.g., the section on public sector integrity) have been left out, as they do not seem to be burning issues for the agency. The focus of the selected questions is on understanding the drivers of citizen satisfaction with the registry's services. Any variable not directly relevant to the registry office has been dropped (e.g., in the section collecting information about the respondent, the categories of income level and postal code have been deleted). A special question has been added to probe whether the registry's perception that the digitalization of data entry would be beneficial is also shared by citizens: "I had to fill out too many forms manually." Two questions also enquire about citizens' awareness of the possibility to request duplicates of certificates online and examine possible reasons why citizens may not want to use this channel.

Appendix B. Citizen Survey as Customized by a Municipal Registry Office: Illustrative Example

Section 1: Respondent Information

1.1. Gender:
☐ Male ☐ Female

1.2. Year of birth:

1.3. Highest educational attainment:
☐ Primary education
☐ Secondary education
☐ Short-cycle tertiary education
☐ Bachelor's degree or equivalent
☐ Master's degree or equivalent
☐ Doctoral degree or equivalent

1.4. Professional situation:
☐ Working (full-time, part-time, or self-employed)
☐ Homemaker
☐ Retired
☐ Unemployed
☐ Student
☐ Other:

Section 2: Access

Finding the relevant contact information and getting in touch with the registry office

2.1.1. How satisfied were you with the ease of finding the correct website/address/contact person?
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember

Once you had identified the right website/address/contact person, how satisfied or dissatisfied were you with the following?

2.1.2. The ease of getting in touch with the registry office
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember

2.1.3. Overall waiting times at the facility; waiting times on the phone or by email
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember

If you went in person/face-to-face, how satisfied were you with the following?

2.1.4. Opening hours
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember

2.1.5. Time it took you to reach the facility
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember

2.1.6. Physical layout of the facility
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied ☐ Do not remember

Using e-government/digital procedures

2.1.7. Are you aware of the fact that duplicates of birth certificates, wedding certificates, and death certificates can be requested and delivered online through the national e-government portal?
☐ Yes ☐ No

2.1.8. Would you consider using this online channel in future?
☐ Yes ☐ No
2.1.9. If not, for what reasons? Please check all that apply.
☐ I do not know how to use online tools/I am not familiar with them
☐ I prefer personal contact
☐ Things get done more easily and/or more quickly face-to-face
☐ I am worried about the protection and security of personal data on the Internet
☐ The service's website or application might have technical problems
☐ Other (specify):

Section 3: User-Centered Service Delivery and Responsiveness

3.1. Receiving timely service

3.1.1. How much time passed between the moment you requested a service and the moment you considered your problem solved?
☐ Up to 5 minutes ☐ Up to 15 minutes ☐ Up to 30 minutes ☐ Up to 1 hour ☐ Up to half a day ☐ Up to a day ☐ Up to 1 week ☐ Up to 2 weeks ☐ Up to 1 month ☐ Up to 3 months ☐ Up to 6 months ☐ Up to 1 year ☐ Not yet resolved ☐ Do not remember

To what extent do you agree or disagree with the following?

3.1.2. I was satisfied with the time it took to get an answer to my initial query.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree ☐ Not applicable

3.1.3. Overall, I was satisfied with the amount of time it took to get the service/to deal with my query.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree ☐ Not applicable

3.2. Are service delivery standards in line with expectations?

3.2.1. If you call with a request, what is a reasonable amount of time to wait before speaking with a registry officer?
☐ None ☐ 30 seconds ☐ 1 minute ☐ 2 minutes ☐ 3 minutes ☐ 4 minutes ☐ 5 minutes ☐ Longer than 5 minutes

3.2.2. When you visit the registry office, how many minutes is it acceptable to wait before speaking to a registry officer?
☐ 1 minute ☐ 2–4 minutes ☐ 5–9 minutes ☐ 10–14 minutes ☐ 15–19 minutes ☐ 20–24 minutes ☐ 25–30 minutes ☐ Longer than 30 minutes

3.2.3. When you email or send documents electronically to a government office by 10:00 a.m., what is a reasonable amount of time to wait before receiving an electronic reply?
☐ 1 hour ☐ 4 hours ☐ Same day ☐ Next day ☐ Within 3 days ☐ Within a week ☐ Longer than a week

Section 4: Reliability and Quality of Service Delivery

4.1. Interacting with staff

To what extent do you agree or disagree with the following statements?

4.1.1. Staff were polite to me.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.1.2. Staff treated me fairly.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.1.3. Staff paid extra attention to me and went out of their way to get me what I needed.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.1.4. Staff were knowledgeable/competent regarding the subject matter.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.2. Receiving clear, high-quality information

To what extent do you agree or disagree with the following statements?

4.2.1. I received high-quality information/advice.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.2.2. I received all the information/advice I needed in one interaction.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.2.3. The information/advice was provided in clear, simple language.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree
4.3. Completing the procedure

To what extent do you agree or disagree with the following statements?

4.3.1. The process was straightforward and easy to understand.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.3.2. The process was easy to complete.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.3.3. The process required little paperwork.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.3.4. I had to manually fill out too many forms.
☐ Strongly disagree ☐ Disagree ☐ Agree ☐ Strongly agree

4.4. Reaching a satisfactory outcome

4.4.1. Thinking about the entire experience, how satisfied were you with the service you got?
☐ Very dissatisfied ☐ Dissatisfied ☐ Satisfied ☐ Very satisfied

4.4.2. Was the service provided better or worse than you expected?
☐ Much worse ☐ Worse ☐ Better ☐ Much better

4.4.3. Would you recommend using this service to another citizen?
☐ No, not at all ☐ Not really ☐ Yes, probably ☐ Absolutely

Section 5: Final Comments

5.1. In your view, what should the public sector's priority area be in terms of improving public service delivery? Check one.
☐ Simplify access to services
☐ Improve quality of services
☐ Reduce cost of services
☐ Improve staff behavior
☐ Improve timeliness
☐ Improve online services
☐ Reduce corruption
☐ Reduce red tape and paperwork
☐ Other (please specify):

5.2. Do you have any additional comments, suggestions, questions, or concerns you would like to share? Please elaborate.

Thank you for completing the questionnaire.
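Note: Closed-ended responses such as those above are commonly summarized as "top-two-box" scores, that is, the share of respondents answering "Satisfied" or "Very satisfied." A minimal sketch in Python, with hypothetical response counts for question 4.4.1 (the figures are illustrative only):

# Hypothetical counts for question 4.4.1 on the four-point satisfaction scale.
responses = {
    "Very dissatisfied": 12,
    "Dissatisfied": 35,
    "Satisfied": 160,
    "Very satisfied": 93,
}

total = sum(responses.values())
satisfied = responses["Satisfied"] + responses["Very satisfied"]
print(f"Overall satisfaction (top-two-box): {satisfied / total:.0%}")  # 84%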
Appendix C. Advantages and Disadvantages of Various Surveying Methods

Face-to-face/in-person interviews

Advantages:
• Positive identification of respondents (e.g., address-based sample)
• Possible higher response rates and lower drop-out rate due to personal contact between interviewer and respondent
• Enables use of interviewing aids such as information cards
• Enables use of longer, more complex questionnaires
• May enable more privacy than other modes
• Can motivate participants
• Questions can be clarified
• Question sequence is controlled
• Vague responses can be probed

Disadvantages:
• More costly than other modes of data collection (in terms of time, money, travel, and human resources required)
• Spatially restricted
• Answers may be filtered or censored
• Repeated attempts to contact respondents can be expensive
• May afford less supervision of interviewers than telephone interviewing
• Concerns for privacy or lack of anonymity may result in lower response rates, especially on sensitive topics
• Cultural and social conditions may also constrain the use of face-to-face interviewing; for example, in small communities, interviewers may know respondents
• The interviewer's presence may influence respondents' answers, thereby introducing bias into the survey results

Telephone interviews

Advantages:
• More cost-effective than face-to-face interviewing
• Enables repeated attempts to contact respondents at lower cost
• If conducted from centralized facilities, enables greater supervision of interviewers
• Affords greater anonymity, which may encourage reporting on sensitive subjects
• Eliminates the need to cluster the sample to reduce enumeration costs
• May enable more flexibility in arranging interview times
• Appropriate for service-specific surveys where there is a contact number for each person from which to draw a sample
• Questions can be clarified
• Question sequence is controlled
• Vague responses can be probed

Disadvantages:
• Requires high telephone saturation nationwide or in the region to avoid creating a biased sampling frame
• Cannot be as long or complex as face-to-face interviews because both respondents and interviewers tire more quickly
• Does not allow the use of visual aids
• Increased use of mobile phones may create problems for creating sampling frames and conducting interviews
• Increased use of technology such as caller ID and call blocking may inhibit the ability to contact respondents
• Some categories of people will be systematically underrepresented
• The number of responses in closed questions is limited
• Telephone surveys are becoming unpopular

Self-administered interviews (e.g., mail-out–mail-back questionnaire)

Advantages:
• Cost-effective
• Affords more privacy and anonymity than other modes, which may prompt a better response rate, especially for sensitive topics
• Greater coverage area
• Respondents have time to consider responses
• The interviewer cannot shape responses

Disadvantages:
• More limited length and complexity: questions should be brief and self-explanatory, and construction and content should be simple enough to be easily understood by respondents who may not be familiar with the concepts the survey is attempting to convey or with questionnaire structures
• Like telephone interviews, surveys need to be shorter than face-to-face surveys and use mainly simple, "tick box" types of questions to achieve a reasonable response rate
• Generally have higher item non-response and more inappropriate responses than interviewer-conducted surveys
• Greater opportunity for respondents to opt out of participation
• Response rates tend to be low and therefore require large numbers of questionnaires to be sent out; mail-out/mail-back surveys require an extensive enumeration period, which may also affect the representativeness of the achieved sample
• Limited scope to ask qualitative questions
• High risk that some citizen groups will be over- or underrepresented, such as those with language or literacy difficulties or with support needs
• No control over who completes the survey
• The interviewer cannot shape questions

Internet-based questionnaires

Advantages:
• Could reduce the costs of processing data; can be very cost-effective
• Affords more privacy and anonymity
• May facilitate asking more sensitive questions
• Can allow for more detailed questions than shorter telephone surveys
• May be particularly useful when surveying specific target groups that are easy to reach through the internet
• Electronic surveys can have a high response rate among internet users
• Respondents have more time to consider responses

Disadvantages:
• More difficult to achieve a representative sample: requires high internet saturation nationwide or in the region to avoid creating a biased sampling frame
• Need to avoid survey fraud and capture of the survey by interest or advocacy groups
• The interviewer cannot shape questions
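Note: Whatever the method, a practical first question is how many completed questionnaires are needed for a given precision. A minimal sketch in Python of the standard sample-size formula for a proportion (the confidence level and margin shown are illustrative choices, not a recommendation of this guide):

import math

# n = z^2 * p * (1 - p) / e^2, the standard sample-size formula for a proportion.
z = 1.96   # 95 percent confidence
p = 0.5    # most conservative assumption about the true proportion
e = 0.05   # +/- 5 percentage points margin of error

n = math.ceil(z**2 * p * (1 - p) / e**2)
print(n)  # 385 completed questionnaires, before any adjustment for response rate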
References and Bibliography

Afrobarometer. 2010. What Can the Afrobarometer Tell Us About Service Delivery in Africa? Briefing Paper 92. Afrobarometer.

AmericasBarometer. 2014. Core Questionnaire. Latin American Public Opinion Project. http://www.vanderbilt.edu/lapop/core-surveys.php.

Andrews, Rhys, and Steven Van de Walle. 2012. "New Public Management and Citizens' Perceptions of Local Service Efficiency, Responsiveness, Equity and Effectiveness." Working Paper 7, Coordinating for Cohesion in the Public Sector of the Future (COCOPS).

Andrews, R., G. A. Boyne, J. Law, and R. M. Walker. 2005. "External Constraints on Local Service Standards: The Case of Comprehensive Performance Assessment in English Local Government." Public Administration 83 (3): 639–56.

Arndt, Christiane, and Charles Oman. 2006. Uses and Abuses of Governance Indicators. OECD Development Centre.

———. 2010. Measuring Governance. Policy Brief 39. OECD Development Centre.

Australia Information Victoria, Department of Innovation, Industry and Regional Development. 2010. On the Road to Satisfaction: Using the Canadian Common Measurements Tool to Measure Satisfaction with Government Services.

Baig, Aamer, Andre Dua, and Vivian Riefberg. 2014. Putting Citizens First: How to Improve Citizens' Experience and Satisfaction with Government Services. McKinsey Center for Government.

Berntzen, Lasse. 2013. "Citizen-Centric eGovernment Services: Use of Indicators to Measure Degree of User Involvement in eGovernment Service Development." Paper presented at the Sixth International Conference on Advances in Human-oriented and Personalized Mechanisms, Technologies, and Services, Venice, Italy.

Bouckaert, Geert, and Steven Van de Walle. 2003. "Comparing Measures of Citizen Trust and User Satisfaction as Indicators of 'Good Governance': Difficulties in Linking Trust and Satisfaction Indicators." International Review of Administrative Sciences 69 (3): 329–44.

Bryson, J. M., B. C. Crosby, and L. Bloomberg. 2014. "Public Value Governance: Moving Beyond Traditional Public Administration and the New Public Management." Public Administration Review 74 (4): 445–56.

Carson, Richard. 2011. Citizen Centric Service: Changing the Way Government Does Business. Toronto: Deloitte.

Carson, Richard, and Suboh Abdelhamid. 2015. Service Delivery Trend Outlook: The Potential Future of Government Customer Service Delivery. The Government Summit Thought Leadership Series. The Government Summit/Deloitte. https://www2.deloitte.com/content/dam/Deloitte/ca/Documents/technology/service-delivery-trend.pdf.

CCMD (Canadian Centre for Management Development). 1998. Citizens First. Canadian Centre for Management Development.

Charron, Nicholas. 2013. European Quality of Government Index 2013 Survey Questions. https://nicholascharron.wordpress.com/european-quality-of-government-index-eqi/.

Charron, Nicholas, et al. 2010. Measuring the Quality of Government and Subnational Variation. Report for the European Commission, Directorate-General Regional Policy, Directorate Policy Development. Gothenburg, Sweden: Quality of Government Institute, Department of Political Science, University of Gothenburg.

Charron, Nicholas, Victor Lapuente, and Lewis Dijkstra. 2012. Regional Governance Matters: A Study on Regional Variation in Quality of Government within the EU. Working Paper 01/2012. European Commission, DG REGIO.

———. 2015. "Mapping the Regional Divide in Europe: A Measure for Assessing Quality of Government in 206 European Regions." Social Indicators Research 122 (2): 315–46.

Christensen, Tom, and Per Lægreid. 2002. "Trust in Government—The Relative Importance of Service Satisfaction, Political Factors and Demography." Working Paper 18. Stein Rokkan Centre for Social Studies, Bergen University Research Foundation.

CSD (Center for the Study of Democracy). 2015. Monitoring Anti-Corruption in Europe—Bridging Policy Evaluation and Corruption Measurement. Sofia, Bulgaria: CSD.

De Weerd, Marga, Mireille Gemmeke, Josine Rigter, and Coen van Rij. 2005. Indicators for Monitoring Active Citizenship and Citizen Education. Report for the European Commission, DG EAC. Amsterdam: Regioplan.

Dolnicar, Sara, and Bettina Grün. 2013. "'Translating' Between Survey Answer Formats." Journal of Business Research 66 (9): 1298–306.

DPER (Department of Public Expenditure and Reform). 2015. Irish Civil Service Customer Satisfaction Survey 2015: Report of Findings. IPSOS MRBI. Department of Public Expenditure and Reform, Ireland.

Duggan, Martin, and Cathy Green. 2008. Transforming Government Service Delivery: New Service Policies for Citizen-Centered Government. IBM Global Social Segment.

EC (European Commission). 2012. "e-Government Benchmark Framework 2012–2015." Method Paper (July). European Commission.

———. 2014. Eurobarometer 397: Corruption Report.

Eurofound (European Foundation for the Improvement of Living and Working Conditions). 2012. European Quality of Life Survey. Luxembourg: Publications Office of the European Union.
———. 2013. Third European Quality of Life Survey—Quality of Society and Public Services. Luxembourg: Publications Office of the European Union.

European Institute for Public Administration. 2006. The Common Assessment Framework (CAF)—Improving an Organisation Through Self-Assessment. European Institute for Public Administration.

Foresti, Marta, and Leni Wild. 2014. Governance Targets and Indicators for Post 2015: An Initial Assessment. London: Overseas Development Institute.

GovLoop. 2014. Your Citizen Engagement Checklist: 18 Strategies for Success. Washington, DC: GovLoop.

Grandvoinnet, Helene, Ghazia Aslam, and Shomikho Raha. 2015. Opening the Black Box: The Contextual Drivers of Social Accountability. New Frontiers of Social Policy. Washington, DC: World Bank.

Heintzman, Ralph, and Brian Marson. 2005. "People, Service and Trust: Is There a Public Sector Service Value Chain?" International Review of Administrative Sciences 71 (4): 549–75.

HM Government. 2007. How to Measure Customer Satisfaction—A Tool to Improve the Experience of Customers. United Kingdom: Her Majesty's Government.

Holmes, Brenton. 2011. "Citizens' Engagement in Policymaking and the Design of Public Services." Research Paper 1, 2011–12, Parliamentary Library, Australia.

Hydén, Göran, and John Samuel, eds. 2011. Making the State Responsive: Experience with Democratic Governance Assessments. New York and Oslo: United Nations Development Program.

Institute for Citizen-Centered Service. 2010. Common Measurement Tool Case Studies: Using the CMT to Move from Research to Results. Institute for Citizen-Centered Service.

———. 2013. Understanding Dimensions of Client Satisfaction and Enabling Actionable Service Performance Management. Institute for Citizen-Centered Service.
IRI (International Republican Institute). 2015. Ukraine Municipal Survey. IRI. http://www.iri.org/sites/default/files/wysiwyg/2015-05-19_ukraine_national_municipal_survey_march_2-20_2015.pdf.

Irish Reform and Delivery Office, Department of Public Expenditure and Reform. 2015. Irish Civil Service Customer Satisfaction Survey 2015: Report of Findings. IPSOS MRBI.

Ivanyna, Maksym, and Anwar Shah. 2010. "Citizen-Centric Governance Indicators: Measuring and Monitoring Governance by Listening to the People and Not the Interest Groups." Policy Research Working Paper 5181. World Bank, Washington, DC.

———. 2015. Towards Greater Objectivity in Governance Measurement: Second Generation Citizen-Centric Governance Indicators. Joint Vienna Institute, Michigan State University, Brookings Institution, World Bank, and SWUFE, China. https://msu.edu/~ivanynam/research/cgi2/Ivanyna_Shah_CGI2_2015.pdf.

Kunicova, Jana. 2015. Building Trust in the Government One Text at a Time. Washington, DC: World Bank.

Le Masson, Bernard, and Khalid Al-Yahya. 2014. Digital Government Pathways to Delivering Public Services for the Future: A Comparative Study of Digital Government Performance Across Ten Countries. Accenture.

Matheson, Angela. 2009. The Public Sector Service Value Chain—Revisiting the First Link with BC Public Sector Works Units. British Columbia, Canada.

McKinsey. 2015. Implementing a Citizen-Centric Approach to Delivering Government Services. Public Sector Insights.

Miklic, Natasa, Marko Derca, and Branko Zibret. 2009. How to Become a Citizen-Centric Government. Study of Public Institutions in Eastern Europe. A. T. Kearney.

Mizzell, Lee. 2008. "Promoting Performance: Using Indicators to Enhance the Effectiveness of Sub Central Spending." Working Paper 5, Public Governance and Territorial Development Directorate, OECD.
Mungiu-Pippidi, Alina. 2015. Public Integrity and Trust in Europe. Berlin: European Research Centre for Anti-Corruption and State-Building (ERCAS), Hertie School of Governance.

New Zealand Government. 2015. New Zealanders' Satisfaction with Public Services: "Kiwis Count" Annual Report.

———. 2017a. "SmartStart: Helping Parents and Babies Get Off to the Best Start. Lessons Learnt from the First Cross-agency Life Event Project." Case Study. https://www.ict.govt.nz/assets/Uploads/DIA-SmartStart-Case-Study.pdf.

———. 2017b. "SmartStart—Making Life Easier for Kiwi Parents." Press release. http://www.scoop.co.nz/stories/PA1702/S00399/smartstart-making-life-easier-for-kiwi-parents.htm.

O'Connell, Isobel Alice. 2000. "Building a Public Sector Benchmarking Framework for Citizen Satisfaction Results." Submitted to the University of Victoria, Canada.

OECD (Organisation for Economic Co-operation and Development). 2009. Donor Approaches to Governance Assessments: 2009 Sourcebook. OECD Publishing.

———. 2012. Practitioner's Guide to Perception Surveys. OECD Publishing.

———. 2013. Government at a Glance 2013. OECD Publishing.

———. 2015. Government at a Glance 2015. OECD Publishing.

Oracle. 2012. "Eight Steps to Great Customer Experiences for Government Agencies." White Paper, Oracle.

Peppers and Rogers Group. 2010. Integrated Public Service Delivery: Achieving Efficiency While Delivering Exceptional Constituent Experiences. Peppers and Rogers Group.

Petty, Kate Reed. 2017. "Is It Time to Retire the Word 'Citizen'?" Los Angeles Review of Books (blog). https://blog.lareviewofbooks.org/essays/time-retire-word-citizen/.

PwC (PricewaterhouseCoopers). 2007. The Road Ahead for Public Service Delivery: Delivering on the Customer Promise. Public Sector Research Centre, PwC.

Reeves, Rachel, and Stephen Bruster. 2009. Better Together: Scotland's Patient Experience Programme: Patient Priorities for Inpatient Care. Report 5. Scottish Government Social Research.

Republic of Turkey. 2016. Office of the Prime Minister Annual Report 2015. Republic of Turkey.

Rodriguez-Pose, Andres, and Enrique Garcilazo. 2013. "Quality of Government and the Returns of Investment: Examining the Impact of Cohesion Expenditure in European Regions." Regional Development Working Papers 2013/12. OECD Publishing. http://dx.doi.org/10.1787/5k43n1zv02g0-en.

Santiso, Carlos, Jorge von Horoch, and Juan Cruz Vieyra. 2014. Improving Lives Through Better Government: Promoting Effective, Efficient, and Open Governments in Latin America and the Caribbean. Technical Note. Inter-American Development Bank.

Scott, Rodney, and Ross Boyd. 2017. Interagency Performance Targets: A Case Study of New Zealand's Results Program. Collaborating Across Boundaries Series. IBM Center for the Business of Government.

SGMAP (Secrétariat général pour la modernisation de l'action publique). 2015. Baromètre 2014 de la qualité de l'accueil : un accueil multicanal toujours plus satisfaisant. Secretariat-General for Government Modernization, France.

Thijs, Nick. 2011. From Citizen/Customer Satisfaction Measurement to Management. Maastricht: European Institute of Public Administration.

Tinholt, Dinand, Niels van der Linden, Michiel Ehrismann, et al. 2015. Future-Proofing eGovernment for a Digital Single Market. Final Insight Report. European Commission, Directorate-General of Communications Networks, Content and Technology. https://ec.europa.eu/futurium/en/system/files/ged/egovernmentbenchmarkinsightreport.pdf.

Transparency International. 2012. Slovakia: Cities Ranked on Their Transparency. https://www.transparency.org/news/feature/slovakia_cities_ranked_on_their_transparency.
———. 2017. Monitoring Corruption and Anti-Corruption in the Sustainable Development Goals—A Resource Guide. Transparency International.

Trapnell, Stephanie. 2015. User's Guide to Measuring Corruption and Anti-Corruption. United Nations Development Program.

UK Cabinet Office. 2004. "The Drivers of Satisfaction with Public Services." Research Study Conducted for the Office of Public Services Reform.

UNDP (United Nations Development Program). 2015. From Old Public Administration to New Public Service—Implications for Public Sector Reform in Developing Countries. Global Center for Public Service Excellence, UNDP.

———. 2016a. Citizen Engagement in Service Delivery—The Critical Role of Public Officials. Global Center for Public Service Excellence, UNDP.

———. 2016b. The Vietnam Provincial Governance and Public Administration Performance Index (PAPI 2015): Measuring Citizens' Experiences. UNDP.

Van Ryzin, G. G., and S. Immerwahr. 2004. "Derived Importance-Performance Analysis of Citizen Survey Data." Public Performance and Management Review 27 (4): 144–73.

World Bank. 2014. Strategic Framework for Mainstreaming Citizen Engagement in World Bank Group Operations. Washington, DC: World Bank. https://openknowledge.worldbank.org/handle/10986/21113. License: CC BY 3.0 IGO.

World Bank. 2015. Enabling Citizen-Driven Improvement of Public Services: Leveraging Technology to Strengthen Accountability in Nigerian Healthcare. ICT4SA Project. Washington, DC: World Bank.

World Economic Forum. 2012. Future of Government—Fast and Curious: How Innovative Governments Can Create Public Value by Leading Citizen-Centric Change in the Face of Global Risks. World Economic Forum.

World Justice Project. 2015. Open Government Index Report. World Justice Project.

———. 2016. Rule of Law Index Report. World Justice Project.