B. Quality management: an overview of basic concepts
19.6. Managing the quality of official statistics is vitally important to compilers’ success in maintaining the trust and confidence of their users and data reporters. Regular dissemination of information on the implementation of rigorous and clearly defined quality standards will also help users better understand and appropriately analyse the statistics and will ultimately raise the visibility and reputation of the compiling agency.
19.7. While there are several general definitions of quality, one of the most commonly used and succinct is “fitness for use” or “fitness for purpose”.
19.8. The development of the NQAF and the guidelines that accompany the Template was undertaken by the Expert Group on NQAF in response to a request by the Statistical Commission at its forty-first session in 2010. The NQAF Template is intended to be used as a tool to provide the general structure within which countries that choose to do so can formulate and operationalize national quality frameworks of their own or further enhance existing ones. Other international organizations have also developed data quality systems. Examples include the IMF DQAF, which is used to assess the quality of countries’ macroeconomic statistics and as a standard presentation of metadata within the SDDS and the GDDS. Eurostat has developed a quality assurance framework (see box 19.1) that contains a total of eight quality criteria for which reporting is defined in the ESS Handbook for Quality Reports, while the “Quality framework and guidelines for OECD statistical activities” focuses explicitly on the quality of the data used, produced and disseminated by the OECD.
19.9. To ensure conformity in the use and interpretation of the quality dimensions by compilers of all datasets within the statistical framework recommended by MSITS 2010 for measuring the international supply of services, all definitions in the present chapter are taken from the NQAF glossary, which was endorsed by the Statistical Commission as part of the NQAF guidelines.
19.10. The NQAF lists the following examples of common quality dimensions or components: relevance, accuracy, reliability, timeliness, punctuality, accessibility, clarity, interpretability, coherence, comparability, credibility, integrity, methodological soundness and serviceability. The dimensions of quality are overlapping and interrelated and, therefore, the adequate management of each of them is essential if information is to be fit for use. SDMX defines eleven quality dimensions: relevance, accuracy, timeliness, punctuality, accessibility, clarity/interpretability, comparability, coherence, integrity, credibility and methodological soundness.
19.11. First of all, compiled statistics should be relevant, meaning that they should meet current and potential users' needs. The compiling agency's challenge is to weigh and balance the conflicting needs of current and potential users to produce
statistics that satisfy the most important needs within given resource constraints. For a breakdown by mode of supply, such statistics should be produced for services items that are important for the compiling economy and should preferably be developed in cooperation with the users of such data (such as the ministries of trade, the economy or foreign affairs). The relevant services could be identified through direct dialogue with the major users or, in the case of data broken down by mode of supply, by examining each services sector’s relative share of total exports/imports of services.
19.12. Accuracy, or the closeness of computations or estimates to the true values that the statistics were intended to measure, should also be ensured by compilers. Accuracy is usually characterized in terms of error in statistical estimates and is often decomposed into bias (systematic error) and variance (random error) components. The assessment of accuracy can contain either numerical measures of accuracy or qualitative assessment indicators. It may also be described in terms of the major sources of error that potentially cause inaccuracy (e.g., coverage, sampling, non-response or response error).
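The bias/variance decomposition described above can be made concrete with a minimal sketch. The figures below are purely hypothetical, not drawn from any actual survey:

```python
import statistics

def accuracy_components(estimates, true_value):
    """Split estimation error into a bias (systematic) component and a
    variance (random) component, per the decomposition above."""
    bias = statistics.mean(estimates) - true_value  # systematic error
    variance = statistics.variance(estimates)       # random spread (sample variance)
    return bias, variance

# Hypothetical repeated estimates of a services-export aggregate whose
# (in practice unknown) true value is 100.0.
bias, var = accuracy_components([98.0, 101.0, 99.0, 102.0], true_value=100.0)
# bias = 0.0 (no systematic error); var ≈ 3.33 (random error remains)
```

In practice the true value is unobserved, so compilers approximate such measures through revision analysis, coverage studies or response-error studies rather than direct comparison.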
19.13. Statistics compiled within the framework for describing the international supply of services should also adhere to standards of timeliness, measured as the length of time between data availability and the event or phenomenon they describe. Timeliness is a crucial element of data quality, as it increases the statistical information’s relevance and its ability to be used effectively by policymakers. For statistics on trade in services, the data should preferably be produced at least on an annual basis. It is good practice for selected (e.g., aggregated) data series covering services transactions between residents and non-residents, FATS or additional indicators on the international supply of services to be produced and disseminated more frequently, depending on the country’s needs and resources. The release date of data should also be punctual, in that it follows the target date announced in the official release calendar. Improving timeliness typically involves a trade-off against accuracy and cost.
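As defined here, timeliness and punctuality reduce to simple date arithmetic, as the following sketch illustrates (the dates are hypothetical):

```python
from datetime import date

def timeliness_days(reference_period_end: date, release_date: date) -> int:
    """Length of time between the event the data describe and their availability."""
    return (release_date - reference_period_end).days

def punctuality_days(announced_release: date, actual_release: date) -> int:
    """Deviation from the official release calendar
    (positive = late, zero = punctual, negative = early)."""
    return (actual_release - announced_release).days

# Hypothetical annual trade-in-services release for reference year 2023.
t = timeliness_days(date(2023, 12, 31), date(2024, 6, 30))  # 182 days after year-end
p = punctuality_days(date(2024, 6, 30), date(2024, 6, 30))  # 0: released on target
```

Tracking these two indicators separately makes the trade-off explicit: a release can be punctual (on calendar) yet not very timely, or vice versa.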
19.14. Accessibility and clarity should also be ensured. Statistics on the international supply of services should be presented in a clear and understandable form, and disseminated through a suitable and convenient medium, with supporting metadata and guidance.
19.15. Furthermore, statistics on the international supply of services should be comparable across geographical areas (i.e., statistics should measure the same phenomenon for different geographical areas); over time (statistics should provide two or more instances of data on the same phenomenon measured at different points in time); and across domains (statistics should include the results from multiple surveys that target similar characteristics in different statistical domains). For data broken down by mode of supply, if a country is focusing on a particular type of service, a description in terms of the Central Product Classification (CPC) of the service would be useful. Moreover, internal coherence (or consistency) and coherence across domains should be ensured, as statistics are often obtained from different sources or based on different approaches, classifications and methodological standards. Metadata must convey information that will help any interested party to evaluate the comparability of the data, which is often the result of a multitude of factors.
19.16. Compiling agencies should also ensure integrity, or the values and related practices that maintain confidence, in the eyes of users, in the agency that produces statistics and, ultimately, in the statistical product. One important aspect of integrity is the ability of users to trust in the objectivity of statistics. Integrity means that professionalism should guide policies and practices and that those policies and practices are supported by ethical standards and transparency. Integrity is closely linked with credibility, or the confidence that users place in statistical products based simply on their image of the statistical authority (i.e., the brand image).
19.17. Compiling agencies should also ensure that sound methodologies are used to compile statistics that comply with the relevant international standards, including the professional standards enshrined in the Fundamental Principles of Official Statistics.
19.18. At the same time, cost effectiveness should also be ensured. For the compilation of statistics on the international supply of services broken down by mode, the mechanical allocation of EBOPS 2010 to modes of supply presents a strong advantage, as that method is relatively inexpensive because it is based on existing data and knowledge of the compiler. Finally, existing data transmission mechanisms and information technology tools should be used to the extent possible.
Quality management and its components
19.19. Quality management is defined in SDMX as the systems and frameworks in place within an organization to manage the quality of statistical products and processes. Quality management refers to the application of a formalized system that documents the structure, responsibilities and procedures put in place for satisfying users, while continuing to improve the data production and dissemination process. It also covers how well resources meet requirements. The concept can be broken down into “assurance”, “assessment” and “documentation”:
(a) Quality assurance refers to all the planned and systematic activities implemented that can be demonstrated to provide confidence that the processes will fulfil the requirements for the statistical output. That includes the design of programmes for quality management, the description of the planning process, the scheduling of work, the frequency of plan updates and other organizational arrangements that support and maintain the planning function;
(b) Quality assessment contains the overall assessment of data quality, based on standard quality criteria. That may include the result of a scoring or grading process for quality. Scoring may be quantitative or qualitative;
(c) Quality documentation contains documentation on methods and standards for assessing data quality on the basis of standard quality criteria.
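A quantitative quality assessment of the scoring kind mentioned in (b) might be organized as in the sketch below. The dimensions, scores and weights are purely illustrative assumptions, not a recommended set:

```python
# Hypothetical assessment: each quality dimension is scored 1-5 and the
# scores are aggregated with weights reflecting their importance to users.
DIMENSION_SCORES = {
    "relevance": 4, "accuracy": 3, "timeliness": 5,
    "accessibility": 4, "coherence": 3,
}
WEIGHTS = {
    "relevance": 0.3, "accuracy": 0.3, "timeliness": 0.2,
    "accessibility": 0.1, "coherence": 0.1,
}

def overall_score(scores, weights):
    """Weighted average of per-dimension scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[d] * weights[d] for d in scores)

score = overall_score(DIMENSION_SCORES, WEIGHTS)  # ≈ 3.8 on a 1-5 scale
```

A qualitative assessment would replace the numeric scores with graded labels (e.g., “meets standard” / “partially meets” / “does not meet”) against the same standard criteria.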
 Statistical agencies have arrived at a consensus that the concept of quality of statistical information is multidimensional and that there is no one single measure of data quality. Several statistical organizations have developed lists of quality dimensions, which, for international organizations, are being harmonized under the leadership of the Committee for the Coordination of Statistical Activities. A description of the activities of the Committee in this respect is available from http://unstats.un.org/unsd/accsub-public/data-quality.htm.
 See http://ec.europa.eu/eurostat/documents/3859598/6651706/KS-GQ-15-003-EN-N.pdf/18dd4bf0-8de6-4f3f-9adb-fab92db1a568.
 Available from http://unstats.un.org/unsd/dnss/QualityNQAF/nqaf.aspx.
 See SDMX, Content-Oriented Guidelines, annex 4, p. 109. Available from http://sdmx.org/wp-content/uploads/2009/01/04_sdmx_cog_annex_4_mcv_2009.pdf.