What is Policy Analytics? An Exploration of 5 Years of Environmental Management Applications

Our digital age is characterized both by generalized access to data and by increased calls for the participation of the public, stakeholders, and communities in policy design and decision-making. This context raises new challenges for political decision-makers and analysts, who face both new means and new obligations to provide these actors with decision support, including in the area of environmental policy. The concept of “policy analytics” was introduced in 2013 as an attempt to develop a framework, tools, and methods to address these challenges. This conceptual initiative prompted numerous research teams to develop empirical applications of the framework and to reflect on their own decision-support practice at the science-policy interface in various environmental domains around the world. During a workshop in Paris in 2018, participants shared and discussed their experiences of these applications and practices. In this paper, we present and analyze a set of six case studies to identify a series of key properties that underpin a policy analytics approach, in order to provide a firmer conceptual foundation for policy analytics to address current policy design and decision-making challenges. The induced properties are demand-orientedness, performativity, normative transparency, and data meaningfulness. We show how these properties materialized in the six case studies, and we explain why we consider them key to effective policy analytics applications, particularly in environmental policy design and decision-making. This clarification of the policy analytics concept finally enables us to highlight research frontiers along which the concept can be further improved.


Introduction
The digital age has provided access to multiple sources of data and information for an increasing part of the world's population and has accelerated opportunities for their analysis, including through increased computational capacity. At the same time, the demand for opening policymaking processes to stakeholders, communities, and the general public has evolved into a generalized call for more inclusive and extensive participation, in some cases becoming entrenched in national or supranational regulations. This has often generated conflicting understandings of problems, driven by multiple bodies of expertise and knowledge on the same issues, which are embodied by diverse actors in society (see e.g., Arts et al. 2017). Since the expansion of environmental movements in the 1970s and 1980s around conservation and environmental protection, the environmental policy domain has long been a prominent arena for the tension between these two trends (increased information availability and calls for participation) (e.g., O'Donnell et al. 2019; Long 2019). However, the current digital age has rapidly amplified the availability of multiple, and at times contradictory, bodies of information.
This context raises new challenges and opportunities for innovatively engaging citizens in decision-making, and for improving policy makers' capacities to intervene effectively in complex problems. In recent years, government actors have more actively sought to address both the opportunities and challenges of new demands and capabilities driven by technological change, as highlighted by the proliferation of dedicated policy and legislative instruments, such as the General Data Protection Regulation in Europe, and high-level strategies developed by the United States, China, France, Germany, and Australia (e.g., Department of Industry, Innovation and Science). Parallel to, and in support of, these shifts, academic research is also seeking to formalize new models of decision support for environmental policies, to enable a productive interplay between the use of new information technologies and enhanced public participation. Among these initiatives, policy analytics, as formalized in Tsoukias et al. (2013) and Daniell et al. (2015), provides a framework, tools, and methods fit for purpose. The term "analytics" has historically been used for decision support within individual sectors, with previous research focusing on areas such as "business analytics," "health analytics," and "learning analytics". Across these applications, the term "analytics" is understood as an umbrella term describing a variety of analytical methods and approaches with a sophistication that can match the complexity of the data types (both qualitative and quantitative), processing, and analysis demands of the digital age (Tsoukias et al. 2013). Tsoukias et al. (2013) wanted to promote the use of such "analytics" tools to address the public policy issues for which they may be relevant. However, Tsoukias et al.
(2013) also stressed the relative difficulty of applying "analytics" within the public realm, mainly due to the unique constraints associated with decision support of public policies: in particular, the use of public money and the associated need for transparency, the prevalence of participatory and deliberative processes, and the nonmonetary and multifaceted nature of policy goals. To capture this twofold ambition, they defined "policy analytics" as a project to "support policy makers in a way that is meaningful (in a sense of being relevant and adding value to the process), operational (in a sense of being practically feasible) and legitimating (in the sense of ensuring transparency and accountability), [by drawing] on a wide range of existing data and knowledge (including factual information, scientific knowledge, and expert knowledge in its many forms) and [combining] this with a constructive approach to surfacing, modeling and understanding the opinions, values and judgments of the range of relevant stakeholders". This concept of "policy analytics" has aroused interest among many researchers in the environmental policy domain in recent years, with numerous discussions about its utility and possible improvements, and several applications in the field conducted in different places around the world. This paper aims to draw on these discussions and applications to clarify the policy analytics concept so that its use and relevance can be expanded. To that end, we analyze a series of examples of concrete applications of the policy analytics framework to environmental policies. We first outline our methodological approach for clarifying the concept ("A Methodology to Rethink "Policy Analytics" as an Approach to Support Environmental Decision-Makers"). We then implement this approach ("Conceptualizing Policy Analytics: Lessons from 5 Years of Applications"). We present our series of case studies ("Examples of Applications").
We then articulate four normative properties that emerged from the discussions and comparisons of these case studies ("Properties of Applications of Policy Analytics"). These properties constitute the core of our proposed improved definition of policy analytics. Last, "Agenda for Further Policy Analytics Research" outlines avenues for future research on and around policy analytics.
A Methodology to Rethink "Policy Analytics" as an Approach to Support Environmental Decision-Makers

In the context of launching a research dynamic, Tsoukias et al. (2013) proposed a deliberately wide definition of policy analytics in order to encourage discussions with a diverse and interdisciplinary group of researchers, policy officials, and data industry collaborators. This strategy proved effective, and a series of research projects were launched and developed as part of an effort to build and gain traction for the policy analytics concept and its application. However, this type of approach, which uses a general definition to avoid excluding useful contributions, also has its limits, especially once the concept is mature enough to be compared with alternative frameworks.
As it happens, numerous other frameworks also attempt to address the challenges associated with developing public policy in a highly data-driven age, including "policy informatics" (Johnston 2015), "computational social sciences" (Lazer et al. 2009), "big data in public affairs" (Mergel et al. 2016), and "utilization-focused" and "systemic evaluation" of public policies (Midgley 2006;Boyd et al. 2007;Patton 2008). Shared among these various frameworks is the acknowledgement that our current information, communication, and technological environment is undergoing rapid changes, and consequently there is both a need and an opportunity for public policy to utilize the capabilities of changing information and communication technologies. Furthermore, these approaches also agree on the issues that will emerge from increased usage of data in both public and private settings, including questions around privacy, legitimacy, and accountability, and the need for new regulatory approaches that mandate certain standards in relation to these governance attributes.
As various research teams began to attempt real-world applications of the policy analytics concept, the lack of specificity in the definition prompted discussions on the definition itself, and on what made policy analytics distinct from the alternative frameworks highlighted earlier. Various papers have proposed alternative definitions based on proposed clarifications of one or several of the criteria mentioned in Tsoukias et al. (2013). Jeanmougin et al. (2017) proposed to formalize the definition of Tsoukias et al. (2013), using policy analytics as an evaluation framework applied to a conservation policy, by singling out four elementary criteria, each associated with concrete examples. As compared with Tsoukias et al. (2013), this formulation retains the operationality and legitimacy criteria, but replaces the "meaningfulness" requirement, which they considered too vague, with two criteria referring, respectively, to a "scientificity" requirement and a requirement to make a demonstrable contribution. However, this clarification focused on a specific usage of the policy analytics framework (as an evaluation tool) and applied to a specific context (i.e., conservation policies). Jeanmougin et al. (2017) also highlighted the difficulty of substantiating the "legitimacy" requirement at the core of the policy analytics framework. Meinard (2017) attempted to clarify this requirement by proposing an open-ended list of legitimacy criteria, but here again the attempt was focused on the specific context of conservation policies. Interestingly, some of the criteria proposed referred to the scientific credentials of the policies whose legitimacy was being evaluated, highlighting that the four criteria proposed by Jeanmougin et al. (2017) are not completely independent.
Although this interdependency between some of the criteria constituting the definition is not necessarily a fatal flaw, a definition based on independent criteria would certainly be clearer. In the same vein, Choulak et al. (2019) briefly discussed the vagueness of the operationality criterion.
The need to clarify the definition and the risks associated with too rigid definitions were discussed in numerous internal seminars among researchers in the group, based on applications of various versions of the framework to a broader variety of policy issues, including the abovementioned environmental issues but also public health problems (Richard et al. 2018) and public management issues (Touret et al. 2019). In the wake of theoretical work clarifying the difference between tools, methods, and approaches in decision-support theories and practices, these discussions pointed to the conclusion that policy analytics is neither a field (such as, e.g., policy analysis) nor a tool nor a methodology (such as, e.g., focus groups or other participatory tools), but rather an "approach" to decision support intended for actors in public policy decision-making. Following Meinard and Tsoukias (2019), we use the term "approach" here to refer to "a way by which [an analyst] conducts a [decision support] process". A given approach can be applied to different issues, which can belong to different academic fields, and it can make use of a variety of methodologies, which can themselves be used by different approaches. In this understanding, which is anchored in Habermas's epistemological views (Habermas 1985, 1990), "approaches" are defined by normative properties that specify key aspects of the way analysts should use available tools and methods.
This view of policy analytics as an approach embodying normative properties opens avenues to complement the top-down definitional approach used in these previous works: through a bottom-up procedure, we can identify normative properties that are, to some extent, shared by exemplary case studies and that could be considered as additions to the definition of policy analytics. Because the case studies explored below were performed with policy analytics in mind, they can be seen as partial but complementary attempts to clarify an underlying ambition shared by all the researchers who decided to gather under the banner of "policy analytics".
In this dynamic, during a workshop in Paris in 2018, a series of examples of policy analytics applications to environmental policies were shared and discussed by participants. These applications provided the empirical material to venture a formulation of key properties, in an abductive approach (Peirce 1966). This formulation was then used in a reconstructive approach to rationalize some key aspects of the applications. The results of this reconstruction are presented in the next section.
We should emphasize at the outset that efforts to clarify the definition in this way are not doomed to constrain the potential of the concept, as Tsoukias et al. (2013) feared. As long as the definition remains open-ended and open to discussion and improvements, attempts to refine it can usefully clarify the underlying ambitions of different policy analytics research programs and provide directions for future investigations.

Conceptualizing Policy Analytics: Lessons from 5 Years of Applications
Using the methodology delineated above, in the present section, we start by describing the six case studies that were discussed in the 2018 workshop ("Examples of Applications"). The descriptions are all organized in the same way: we start by explaining the context (which policy is at issue, which processes are engaged) (1). We then explain the reasons why the researchers involved conceived of their work as applications of the policy analytics concept. Because, as explained in the previous section, the original definition of policy analytics was quite open, these reasons were disparate and, very often, rested on quite different interpretations of the concept (2). We then describe the data produced and/or analyzed (3). We finish by summarizing the outcome of each policy analytics application (4).
Following this description of the case studies, we articulate the four normative properties that emerged from the discussions and comparisons of case studies, which we propose as candidates to structure an improved definition of policy analytics ("Properties of Applications of Policy Analytics").

Examples of Applications
Case 1: Elaboration of a Wetland Prioritization Platform

(1) The first case involved the elaboration of an operational wetland prioritization platform in Bourgogne-Franche-Comté (Choulak et al. 2019) that would be seen as legitimate by its key stakeholders. Wetlands are ecosystems whose functioning is largely determined by water, such as swamps, alluvial forests, bogs, etc. These ecosystems are the target of numerous conservation policies around the world, including the RAMSAR convention, and dedicated legislation in France. Wetland prioritization is a crucial step in most action plans devoted to conserving or restoring wetlands in line with these policies. It consists of using available data on wetlands (e.g., ecological features, hydraulic functions) and their context (e.g., urbanization dynamics, land use) to decide which wetlands managers should prioritize. In 2017, the "wetland taskforce" ("Pôle Milieux Humides") of the Bourgogne-Franche-Comté region (France)-a team within a nonprofit environmental organization (Conservatoire d'Espaces Naturels)-was entrusted by a consortium of regional-to-national-scale institutions funding environmental actions with elaborating a spatialized database on wetlands. The database was to cover the whole region and to rest on a new prioritization methodology that also needed to be elaborated.
(2) Relevant databases available for prioritizing wetlands are large and heterogeneous, and very often standard practices tend to conflate very different kinds of data indiscriminately. Some of the databases house quantitative scientific data such as the results of hydrological models or data on the abundance of a given species. Others have political aspects and may include different forms of qualitative and quantitative information, such as zoning maps produced through political processes. Tsoukias et al. (2013) emphasized the importance of taking into account the nature and design of data to provide relevant and legitimate decision support. The researchers involved in this case study therefore saw standard practices in wetland prioritization as an example domain in which policy analytics could make a difference, by developing methods that give importance to the nature of the data they use and their design.
(3) The data used were the contents of the spatialized database elaborated by the wetland taskforce, together with all the metadata corresponding to the methodologies used to capture these data, which we used to develop rules to aggregate parts of the information in the database using a rule-based approach (Azibi and Vanderpooten 2002). An example of a rule in this context was: "if there is no indicator in the database testifying that a given wetland plays a role in flood mitigation, then this wetland is assigned to the category 'no information in the database suggesting that it is suitable, even poorly, to pursue the objective of conserving wetlands performing a flood regulation function'". A rule-based approach consists of identifying a consistent set of such rules allowing information in the database to be aggregated. To design these rules, we worked with representatives of wetland managers, who collectively identified a series of management objectives that they deemed they had the political legitimacy to choose. We then used a rule-based aggregation method and MR-Sort, a noncompensatory aggregation method (Leroy et al. 2011), to produce a framework that the wetland taskforce will be able to use autonomously.
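As an illustration of the kind of noncompensatory sorting that MR-Sort performs, the following minimal sketch assigns a wetland to a priority category when its scores meet the category's lower profile on a weighted majority of criteria. The criteria, weights, profiles, scores, and majority threshold below are hypothetical examples for illustration only, not the data or parameters of the platform described above.

```python
# Minimal MR-Sort sketch (noncompensatory sorting; cf. Leroy et al. 2011).
# All names and numbers are hypothetical, chosen only to illustrate the idea.

def mr_sort(performance, profiles, weights, cut):
    """Assign an alternative to the best category whose lower profile it meets
    on a weighted majority (>= cut) of criteria.

    performance: dict criterion -> score of the alternative
    profiles: list of dicts, lower profiles of categories, best category first
    weights: dict criterion -> weight (weights sum to 1)
    cut: majority threshold lambda in (0.5, 1]
    """
    for category, profile in enumerate(profiles):
        # Total weight of criteria on which the alternative reaches the profile
        support = sum(w for c, w in weights.items() if performance[c] >= profile[c])
        if support >= cut:
            return category  # 0 = best category
    return len(profiles)  # fell below every profile: worst category

weights = {"flood_mitigation": 0.4, "biodiversity": 0.35, "water_quality": 0.25}
profiles = [  # lower profiles of "high priority" (0) and "medium priority" (1)
    {"flood_mitigation": 0.7, "biodiversity": 0.7, "water_quality": 0.7},
    {"flood_mitigation": 0.4, "biodiversity": 0.4, "water_quality": 0.4},
]
wetland = {"flood_mitigation": 0.8, "biodiversity": 0.75, "water_quality": 0.3}

print(mr_sort(wetland, profiles, weights, cut=0.6))  # 0: high priority
```

Because the method is noncompensatory, a very high score on one criterion cannot offset poor scores on the others: assignment depends only on which criteria reach each category's profile, which is what keeps technical judgments about indicators separable from political choices of objectives.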
(4) The concrete outcome is a platform with which the wetland taskforce will be able to prioritize wetlands for managers, in a legitimate and fine-tuned way, thereby fulfilling the promise to add value and strengthen legitimacy by paying particular attention to the nature and design of data. The increased legitimacy stems from the fact that, whereas standard practices in wetland prioritization indiscriminately conflate technical choices (concerning, for example, the reliability of this or that indicator) and political choices (e.g., choices of objectives to pursue), this platform makes a point of not preempting the latter (see Choulak et al. 2019 for more details). The platform has been applied to several projects over the past few months (Melanie Paris, personal communication), and regional-scale funding institutions are interested in applying this new method at a larger scale. From a theoretical point of view, our main contribution is the notion of "meta-decision analysis". This notion stresses that, while researchers in decision sciences can provide decision support to decision-makers in some contexts, many other actors, such as consultants, experts, and stakeholders, can play the role of "decision support providers". Instead of providing decision support to a particular decision-maker facing a particular problem, a researcher involved in "meta-decision analysis" will strive to identify and help legitimate "decision support providers" to help decision-makers (see Choulak et al. 2019). Meta-decision support is, in our view, a corollary of the emphasis on legitimacy championed by authors in the policy analytics space.

Case 2: Facilitating Dialog over a Marine Pollution Dispute
(1) The second case study relates to the "red mud" conflict in the Calanques National Park (southern France). In Marseille, there is an enduring dispute about waste disposal in the Mediterranean Sea, which is supposedly forbidden by the Barcelona convention of 1992. A factory has had a long-term special dispensation allowing it to dispose of massive quantities of residuals of the transformation of bauxite-the so-called "red mud". This pollution is considered illegitimate by a part of the population and creates a strong political conflict, although most people also acknowledge that the jobs provided by this factory are vital for the area. Despite public worries, the administration believes that all has been done to improve practices-but there is no communication among opposing worlds and thus no reduction of political conflict; as a result, the main argumentative discussions take place in judicial courts.
(2) In this context, the data available on past and current disputes are numerous (e.g., reports by experts and consultants, surveys by journalists, scientific studies, and data from monitoring programs). However, in this deeply conflictual context, some of these data can be easily manipulated, and tracing back the biases that might have plagued them is hazardous. This is why the researcher involved in this case study saw it as an especially potent illustration of the idea, stressed in Tsoukias et al. (2013), that in such a complex context, sui generis processes are required to generate reliable data.
(3) A role-playing game was co-produced with local inhabitants, environmental associations, political decision-makers, and representatives of the factory to represent a range of points of view and values in a single format. Based on long interviews, cognitive maps that brought together the definition of problems, actors, and possible actions were produced. Last, three participatory techniques were used to help structure debates: a serious game, participatory theater, and the co-construction of a research project between researchers and activists. The serious game initially aimed to create debate but was transformed into an education game because the field study itself created too much tension. It has been used in diverse contexts in the region since then, but never with a group of people in serious conflict. Artists then developed a theater play to organize discussion forums where opponents to the factory, involved scientists, and the general public met and generated new discussions about the problem and the possibilities for solving it. Eventually, 50 interested people were invited to co-construct a new research project about the multiplicity of forms of pollution and their circulation in the area, so as to raise awareness of the red-mud issue and evaluate the vulnerability of the territory.
(4) The outcomes of this case study confirm the fruitfulness of developing sui generis tools that generate entirely new data, in a context in which analyzing existing data would be methodologically questionable. The continued adaptation of the participatory techniques and of their implementation helped to better understand the diversity of points of view. Contradictory normative views concerning social priorities could be characterized and discussed, which facilitated communication among opposing worlds. The co-constructed knowledge production has strengthened links between scientists and associations, who in parallel have found representatives able to interact regularly with the administration. Public trust in the administration was thereby strengthened, and administrations renewed their interest in creating arenas of dialog. However, the political problem lingers on.

Case 3: Facilitating Reflection on a Collaborative Water Management Network
(1) The third case focuses on the construction of collaborative environmental networks in the Gironde estuary (New Aquitaine, South West France) (Boschet and Rambonilaza 2017). In the context of the Water Framework Directive (WFD) and its implementation at the local river basin scale, as well as the Birds and Habitats Directives (Natura 2000 sites), several participatory mechanisms have been introduced. At the same time, local decision-makers have expressed their wish to orient the future development of the riparian municipalities around the preservation and enhancement of natural and heritage resources, in an area that has historically served as an industrial port. The major challenges were the lack of links between the two shores of the estuary, and a lack of visibility for the group of stakeholders who deal with the environmental issues of the estuary.
(2) One of the most important ideas emphasized by Tsoukias et al. (2013) in their introductory definition of policy analytics is that public policy contexts make it particularly difficult to use the sophisticated techniques typically associated with so-called "analytics". This is because these sophisticated techniques are difficult for stakeholders and decision-makers to understand, whereas in public policy contexts, transparency, participation, and deliberation play a key role. The researchers involved in the present case study saw this context as an opportunity to test whether it is possible to meet both policy analytics ambitions, by putting some sophisticated analytic techniques-in this case network analysis and statistical models-to use to help actors understand their interactions and to coproduce new interactions.
(3) The case study involved an ex-post analysis of the functioning of collaborative environmental governance and the main factors explaining how collaboration relationships form, and an assessment of the heterogeneity and representativeness of the stakeholders involved, as recommended by the WFD (Art. 14). Data collection used documentary sources to identify representatives of organizations and count their participation in four policy processes in the Gironde estuary (514 individuals representing 386 organizations). A two-mode network methodology and a preliminary field survey were used to define the population of interest ("the actors who act"). Then a final survey of this population produced data covering their exchanges of information, expertise, and resources, as well as the names of the people who are members of their network, who were themselves interviewed afterward. The interviewees were asked to name the network members who are the most important in the estuary's environmental management, then in a second step to name their actual partners, leading them to distinguish their understanding of the whole network from their personal network of collaboration. The survey, which followed a snowball sample methodology, was halted when no new names were mentioned by the respondents. These questions were integrated into a broader interview grid, which highlighted the interviewees' perceptions of opportunities and barriers to working with potential partners. The use of data (the actors involved and their relationships) first provided the current state of the collaborative network: the actors and their links, their position in the network, and the diversity of exchanges (financial, informational, contractual, informal…). A second step, which used statistical models of networks, consisted of assessing the factors facilitating or enabling collaboration links. In particular, the distance between the actors was systematically analyzed.
By "distance," we mean not only physical distance, but also institutional distance (the positioning of stakeholders in relation to the rules governing the management of environmental issues), organizational distance (the principles that dictate the involvement of stakeholders within participation devices), and finally statutory distance (the specificity introduced by the roles devolved to the political and administrative apparatus via the statuses of the actors, as elected officials or bureaucrats). The outcome was a visualization of the collaborative network.
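The two-mode construction described above can be sketched in miniature: organizations are linked to the policy processes they attend, and the one-mode projection ties two organizations whenever they co-attend at least one process, weighting each tie by the number of shared processes. The organizations and processes below are invented for illustration; the actual study covered 514 individuals representing 386 organizations across four policy processes.

```python
# Illustrative two-mode (organization x policy process) network and its
# one-mode projection. All organization and process names are hypothetical.

from collections import defaultdict
from itertools import combinations

participation = {  # organization -> set of policy processes attended
    "Port authority": {"WFD basin committee", "Natura 2000 board"},
    "Fishers' association": {"WFD basin committee"},
    "Nature NGO": {"Natura 2000 board", "Estuary forum"},
    "Municipality A": {"Estuary forum", "WFD basin committee"},
}

def project_one_mode(participation):
    """Project the two-mode network onto organizations: an edge links two
    organizations that co-attend at least one process, weighted by the
    number of shared processes."""
    ties = defaultdict(int)
    for a, b in combinations(sorted(participation), 2):
        shared = participation[a] & participation[b]
        if shared:
            ties[(a, b)] = len(shared)
    return dict(ties)

for (a, b), w in project_one_mode(participation).items():
    print(f"{a} -- {b}  (shared processes: {w})")
```

Edge weights from such a projection provide the kind of starting point used by the statistical network models mentioned above, which can then test whether attributes such as physical, institutional, organizational, or statutory distance make a collaboration tie more or less likely.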
(4) This work makes several contributions, illustrating how analytics tools can be put to use in a public policy context, despite the prima facie contradiction between the complexity of these tools and the requirements of participation. It provides a robust representation of the current state of the group of actors involved, factual proof of the separation between the two shores in terms of collaboration, and cognitive support to the actors involved in terms of their social working environment. It also helped the Gironde and Charente local administrations ("Conseils Départementaux"), and the "Syndicat Mixte" of the Gironde Estuary, to rebuild the collaborative network of actors mobilized around environmental stakes in the estuary. It also renews the political economy analysis of the implementation of environmental policies at the local level. Finally, this work forced some actors to acknowledge the inertia of some networks of interaction, and its adverse implications. This eventually prompted them to encourage the arrival of new entrants, particularly economic players who have developed activities related to the estuary's heritage.

Case 4: Water Management Policy Design
(1) This case study deals with water management in the agricultural system of the Apulia Region (Italy), characterized by policy resistance that hampers the implementation of a water protection policy. Due to the limited availability of water resources, agricultural activities rely on the combined use of both surface water and groundwater. Groundwater overexploitation depletes water quantity and quality, leading to long-term social and environmental problems, including restrictive groundwater measures according to the WFD (Portoghese et al. 2013). The policies implemented in the area aim either to improve the efficiency of groundwater use through innovative irrigation techniques or to restrict groundwater use through regulation and tight control of farmers' activities (Giordano et al. 2015). Based on a traditional policy-making approach, this policy was developed without considering the potential impacts on the stakeholders, creating a strong conflict among them. This case study hence represents an emblematic example of the complexity of water management, where decision-makers with competing objectives and values need to share the same resource. A limited understanding of the different problem framings can be a source of conflict, hampering the implementation and/or reducing the effectiveness of environmental policies (Giordano et al. 2017). Stakeholders act as if the decision space were as simple as they presume it to be (i.e., ignoring the role of some of the other actors and/or making assumptions about their decisional processes). A detailed description of the case study and the analysis of the ambiguity in problem framing can be found in Giordano et al. (2017).
(2) By highlighting the distinctive challenges involved in trying to use "analytics" tools in public policy contexts, publications on policy analytics provide a partial explanation of the fact that sophisticated decision-support methods tend to be poorly used in at least some public policy contexts. It occurred to the authors involved in this case study that their context of defective water management policies illustrated this idea. They therefore took this context as an opportunity to try to fulfill the corresponding promise of policy analytics, which is to put state-of-the-art decision-support tools to use in a complex and conflictual public policy context.
(3) The data-generating work focused on the policy design process (i.e., the design of policy alternatives), using an innovative participatory approach. Mainstream policy-making tends to neglect the generation of novel policy alternatives and is more concerned with evaluating known alternatives (Ferretti et al. 2018; Pluchinotta et al. 2019). The experiences carried out in the Apulia case study supported the application of the Policy-KCP (P-KCP) participatory tool for the design of policy alternatives, integrating Decision science and Design theory. P-KCP is a Concept-Knowledge theory-driven tool (a design theory), adapted to the design of abstract objects such as public policies. The P-KCP aims to formalize the innovative design of policy alternatives within a public decision-making process. The P-KCP supports the creation of a shared artifact (Ostanello and Tsoukiàs 1993), further motivating stakeholders' engagement and commitment to a participative policy-making process. The steps of the P-KCP participatory tool are described in Pluchinotta et al. (2019). The P-KCP participatory tool assisted policy makers and stakeholders in working together to generate policy alternatives and overcome the difficulties of traditional approaches. The phase of knowledge elicitation and alignment (P-K phase) represents the starting point for building a shared concern, leading toward a generative phase (P-C phase). The P-K phase supported identification of the state of common knowledge on groundwater protection and surface water management problems, including the quali-quantitative state of aquifers and the analysis of the different stakeholders' problem framing (Giordano et al. 2017). The knowledge elicitation activities were carried out by integrating scientific and technical pieces of evidence available in the literature with expert and local knowledge, according to participatory work principles.
The results of semistructured interviews structured in mental models were combined with the outputs of the stakeholders' analysis and scientific literature studies, available data, emerging technologies, best practices, and current policies.
(4) The main outcome of this study was the pilot application of an original approach for the innovative design of policy alternatives, illustrating how a state-of-the-art decision-support tool can be implemented in a complex and conflictual public policy setting. The proposed methodology (P-KCP), integrating Decision Science and Design theory, formalized the policy design process and supported the generation of previously unimaginable policy alternatives. It connected local and expert knowledge throughout the design process, thanks to the construction of a collective problem understanding (i.e., a shared concern). It brought together stakeholders, experts, and institutional and noninstitutional actors, helping them to find new ways of working together efficiently, generating innovative possible alternatives, and encouraging longer-term thinking. As a result, we observed that policy design can be a generative process for the creation of a new dimension of values, through the creation of new variables and/or the elimination of variables having little value for the process. For example, within the case study, we were able to introduce new alternatives in order to modify the value structures of the policy-making process.

Case 5: Decision Support for Catchment Management
(1) This study deals with a collection of decision-support processes involving modeling for integrated catchment management and the stakeholders of these catchments, carried out by a team of researchers at The Integrated Catchment Assessment and Management Centre at the Australian National University over the past few decades (see Merritt et al. (2017) for an overview of some applications). Integrated Water Resources Management is a widely recognized paradigm for making more inclusive policy decisions regulating the many, often competing, users of water; however, without effective decision support or "policy analytics," the promise of the paradigm is hard to realize. In a typical situation, a project is developed in partnership with water management authorities in Australia through co-creation of a research topic, informed both by opportunities identified by the university and by the available resources and priorities of the agency. To ensure the legitimacy of the decision-support processes and models, a steering committee is used to provide feedback, in addition to close involvement from government personnel and landholders.
(2) While some of the collection of work in this case study predates discussion of the expression "policy analytics," the researchers involved consider the use of analytical tools to support policy decision-making to be eminently aligned with policy analytics, notably through the use of participatory techniques combined with integrated modeling; the projects typically satisfy all four normative principles defining policy analytics, as listed in the next section.
(3) A typical project merges data and information from stakeholders and science through participatory processes and integrated modeling. Modeling provides a natural means for organizing and integrating economic, ecological, and hydrological data, qualitative stakeholder input, and interviews. An iterative process is adopted (Jakeman et al. 2006), recognizing that the design of both participatory processes and integrated models needs to be purpose- and context-driven, but that new information arises over time that requires changes to the project plan (Lahtinen et al. 2017). Data used in the construction of models, and the resulting model outputs, play an important role in water management, supporting understanding of biophysical processes and anticipation of the impact of policy or management measures. Integrated modeling then helps to tie economic and ecological outcomes to hydrological processes and intervention measures. Workshops to gain a common understanding of the system are supplemented by interviews targeting sector-specific understanding of agriculture and ecological outcomes. A pragmatic model-building approach is used, representing systems at the required level of complexity and mixing methods for different model components in order to best integrate the knowledge of decision-makers, multiple expert disciplines, and on-the-ground stakeholders. A spatially semidistributed hydrological model provides information at key points and aggregate regions, reducing the risk of information overload for users and allowing for interactive use of the model. Uncertainty in outcomes is dealt with using scenarios and Bayesian Networks (Kelly et al. 2013; Maier et al. 2016), which have typically received positive feedback from users. The result is inherently interdisciplinary, such that communication within the project plays an important role.
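The scenario-based use of Bayesian Networks mentioned above can be sketched minimally as follows. The variables, states, and probabilities are entirely hypothetical, for illustration only, and are not drawn from the catchment models described in this case study:

```python
# Minimal discrete Bayesian network sketch for scenario analysis.
# All variables, states, and probabilities are illustrative assumptions,
# not values from the catchment studies described in the text.

# P(flow | policy): effect of a management intervention on stream flow
p_flow = {
    "restrict": {"high": 0.7, "low": 0.3},
    "status_quo": {"high": 0.3, "low": 0.7},
}
# P(outcome | flow): ecological response to stream flow
p_outcome = {
    "high": {"good": 0.8, "poor": 0.2},
    "low": {"good": 0.3, "poor": 0.7},
}

def p_good_outcome(policy):
    """Marginalize over flow to get P(outcome = good | policy)."""
    return sum(
        p_flow[policy][flow] * p_outcome[flow]["good"]
        for flow in ("high", "low")
    )

for policy in ("restrict", "status_quo"):
    print(policy, round(p_good_outcome(policy), 3))
```

Decision-support tools built this way let stakeholders see how the probability of a desirable outcome shifts under alternative interventions, while keeping the conditional probability tables themselves open to critique and revision.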
(4) Project outcomes are delivered both through the stakeholder engagement process and through the produced model and decision-support tool. The stakeholder engagement process facilitates social learning and shared problem framing, as well as building trust in the model. The model provides cross-sectoral estimates of the impact of various water policies and management interventions, in a transparent, traceable manner that the stakeholders can critique and discuss. Both the engagement process and the produced tool then influence regulatory and agricultural decision-making processes. Importantly, there is no ex ante expectation that the model or its outputs are directly referenced in decision-making. The project is understood to be one of many sources of information that decision-makers draw on. Shaping understanding of the situation is the main priority, along with adjusting different stakeholders' views of how the world operates and their relationships to each other, which makes evaluation of this type of policy analytics project particularly challenging (Hamilton et al. 2019).

Case 6: Participatory Revision of a Water Management Plan
(1) This study focuses on the participatory process used to revise a water management plan in the Drôme river valley, located in southeastern France. The river is managed by a basin institution and a local water committee. The basin institution is in charge of coordinating stakeholders, facilitating the local water committee, and carrying out construction and maintenance work. The local water committee is in charge of developing, revising, and monitoring the implementation of the river management plan. The first river management plan in the Drôme was established in the mid-1990s (the Drôme was the first river basin in France to establish such a plan). This plan was revised for the first time between 2007 and 2013. For the second revision, starting in 2018, policy makers wanted to use an innovative approach, enabling citizens to make concrete proposals that would then be examined by the local water committee for inclusion in the revised river management plan. This participatory process was supported by a European project, SPARE (Strategic Planning for Alpine River Ecosystems, co-financed by the European Union via Interreg Alpine Space), and by international researchers. As a result, between November 2016 and October 2018, 344 citizens were involved in (i) the launch of the process, (ii) the design of the process, (iii) a participatory diagnosis of the river basin, (iv) the identification of the main stakes of the river basin and the proposal of actions, and (v) the synthesis of the results. In total, 62 participatory events were organized over 2 years.
(2) The researchers involved saw this context as an opportunity to explore an aspect of the ambition heralded by policy analytics: how a large amount of data could be gathered and analyzed in a participatory context, in such a way as to improve the decisions made by policy makers by anchoring them in new data, while monitoring the involvement of participants in the process.
(3) The various steps of the process produced a large amount of data, including 85 initial questionnaires about citizens' perceptions of the river and of participation, 630 contributions to the citizens' diagnosis, 189 propositions of actions, 3 action plans, 1 final report, 5 thematic syntheses sent to the local water committee, and answers to 78 questions asked by citizens. In addition, the participatory process itself was monitored and evaluated to provide data about the composition of the participants' group, its representativeness, and the retention level of participants (whether participants stayed throughout the whole process or left part way through). Data were collected by researchers, facilitators, and the participants themselves. A group of 16 citizen volunteers contributed to data framing and collection. Data analyses were carried out by researchers and policy makers while the process was underway.
(4) The project facilitated a better understanding of the opinions, values, and judgments of participants: for example, the 85 initial questionnaires provided data about citizens' perceptions of the river and of participation (see results at http://www.alpine-space.eu/projects/spare/en/pilot-case-studies/drome/charts). The participatory diagnosis outlined what participants liked or disliked in the river basin, what they considered needed to be conserved or modified, what data they lacked, and what questions they had. The results were also used to support policy makers, at two levels. First, data produced by participants fueled the revision of the water management plan. It highlighted issues that were important to citizens and that had to date been left out by policy makers, a number of which were subjective, such as the importance of the landscape and attachment to the territory. It also allowed an analysis of who was present during the process and who was absent. For instance, since the process attracted mainly people over 65 in its initial phases, an online participatory tool was set up for the action proposal phase so that working people and parents could participate as well. As a result, 52 additional participants contributed. Adapting the process in real time illustrated how data gathering and analysis can be included in the participatory process, rather than being postponed to the end of the participatory phase. Following a similar adaptive logic, the analysis of the participants' group composition also fueled the reflexivity of the group of participants, who wondered whether it was legitimate for them to make decisions about the river given that they were not representative of the population.
Finally, the project strengthened the policy process in the sense that all the data produced were proofread by participants and then put online, thereby improving the overall transparency of the policy-making process (the results were presented during participatory events and made available online on a forum set up for this purpose: https://sites.google.com/site/dromenjeu/). As a result, newcomers could see what had been produced by the group when they joined the process, and participants could promote and share their productions.

Properties of Applications of Policy Analytics
As detailed in "A Methodology to Rethink "Policy Analytics" as an Approach to Support Environmental Decision-Makers," discussions and reflections on the above case studies (and additional ones that are not detailed here, such as Kana et al. (2014) and Raboun et al. (2019)) led to the collective identification of normative properties that, we claim, should accompany applications of policy analytics. The case studies explored above do not specifically embody all these properties, since they were not designed with these properties in mind. Rather, they were motivated by publications and discussions on policy analytics or by ideas that featured prominently in such discussions. The properties in this section were thus identified ex post from the collective analysis of these case studies. Future works embodying our four normative properties will demonstrate what we now consider to be important attributes for policy analytics approaches. The first two properties are concerned with capturing the specific aspects of policy analytics associated with its anchoring in decision analysis. The other two are meant to outline policy analytics features associated with its application to public policies.
We do not claim that each one of these properties is entirely novel for public policy studies. Many studies could rightfully claim that they satisfy one of these properties, and there might even be applications that satisfy several of them. Our claim is that a study that satisfies them all materializes the ambition underlying the policy analytics research program.

P1: Demand-orientedness. Our experiences in the different case studies above showed us that, in most cases, the fact that our academic initiatives could easily respond to a demand voiced by actors in the field was key to fulfilling the ambition of co-producing solutions with decision-makers. In the various cases in which the project was directly and explicitly requested by an institution or an actor (the wetland taskforce and, ultimately, the consortium of water-related institutions in "Case 1: Elaboration of a Wetland Prioritization Platform", the local regional administration in "Case 3: Facilitating Reflection on a Collaborative Water Management Network", various water management authorities in "Case 5: Decision Support for Catchment Management", and the basin institution in "Case 6: Participatory Revision of a Water Management Plan"), this strengthened the involvement of various actors in the decision process, including of course the ones issuing the request but others as well. In the other cases ("Case 2: Facilitating Dialog over a Marine Pollution Dispute" and "Case 4: Water Management Policy Design"), although the project stemmed from an initially academic questioning, the fact that it addressed problems that actors deemed important played a key role, as demonstrated by the fact that various actors ultimately endorsed the questioning as their own.
This suggests the importance of endorsing the normative idea that the justification of, and motivation for, an application of policy analytics should not be purely academic, and should be anchored in a real demand, voiced by actors, groups, or institutions in the field. This does not always mean that the demand should pre-exist and be voiced by an actor or institution already enjoying a form of authority: it can be created as the research project unfolds, which can take time. But in that case, the created demand will qualify as a demand properly speaking, and the study will qualify as demand-oriented, if and only if there are actors, groups, or institutions who end up endorsing this demand and making use of the approach and its outcomes. This theoretically disqualifies academic studies that do not respond to an actual use case, even if they claim to respond to a generic "societal demand". We note that much useful academic work may be precursory to being able to apply policy analytics approaches in a demand-oriented manner, such as algorithm development and other methodological developments, and that in such situations the distinctions between good theory development and praxis in any application-focused academic endeavor are inherently fuzzy.

P2: Performativity. By promoting operationalization and the importance of co-production, policy analytics stresses that decision-support interventions should not be purely academic, and should rather feed concrete applications, leading to improvements of the situation they study. This idea played a key role in all of our case studies: in "Case 1: Elaboration of a Wetland Prioritization Platform", the outcome was a new prioritization tool that the decision-aiding provider will use on a daily basis in its interactions with wetland managers, which will inevitably lead to concrete changes in their conservation strategies and in the concrete restoration actions they will implement.
In "Case 2: Facilitating Dialog over a Marine Pollution Dispute", the project deployment led to the construction of an active debate arena, enabling discussions among concerned populations to be reorganized. The analytical results in "Case 3: Facilitating Reflection on a Collaborative Water Management Network" helped to guide future actions of decision-makers in association with the actors of the collaborative network, leading to the emergence of a new "policy trajectory". In "Case 4: Water Management Policy Design", the study designed new policy alternatives, which will be included in and enrich existing policy-making processes. In "Case 5: Decision Support for Catchment Management", water managers in numerous settings used the results of the modeling exercise to inform and make planning decisions. In "Case 6: Participatory Revision of a Water Management Plan," the intervention led to process adaptations, as illustrated by the online participatory tool set up for the action proposal phase. In all cases, this direct link with applications played a key role in ensuring that the approach was relevant and operational. This suggests the following normative property: the aim of applications of policy analytics should not simply be to describe or analyze states of affairs or processes; it should be to support actions that will encourage improvements of these states of affairs and processes, ideally in new and positive directions. This application-focused aspect is what we call "performativity". This excludes purely descriptive approaches. However, it does not exclude integration of descriptive substudies within a policy analytics project.
P3: Normative transparency. Our various case studies show that, when trying to fulfill a particular aspect of the initial policy analytics ambition, we were all led to work out our own normative assumptions and forced to clarify and display them. This involves, among other things: reflexively identifying or choosing the role that analysts have in their interactions with decision-makers (illustrated in particular in "Case 1: Elaboration of a Wetland Prioritization Platform"), analyzing and improving existing decision-aiding structures ("Case 3: Facilitating Reflection on a Collaborative Water Management Network"), analyzing and modifying when needed the set of stakeholders, concerned citizens, and various experts that are involved in the decision process ("Case 6: Participatory Revision of a Water Management Plan"), and analyzing the broader significance of the results of the study, and its chosen boundaries, to identify if and how they can support more generalized conclusions ("Case 1: Elaboration of a Wetland Prioritization Platform," "Case 3: Facilitating Reflection on a Collaborative Water Management Network", "Case 4: Water Management Policy Design," "Case 5: Decision Support for Catchment Management," and "Case 6: Participatory Revision of a Water Management Plan"). This requirement was present from the start in "Case 1: Elaboration of a Wetland Prioritization Platform", since the data were specifically selected and aggregated in such a way as to prevent any risk that some actors might think that the method used preempted legitimate political or other value-laden choices. In "Case 2: Facilitating Dialog over a Marine Pollution Dispute", normative considerations did not take center stage at the beginning of the project, but because the first results unveiled clashes of normative frameworks among the actors concerned, the need to be transparent with respect to the normative underpinning of the methods used ended up playing a key role.
In "Case 3: Facilitating Reflection on a Collaborative Water Management Network", "Case 4: Water Management Policy Design", "Case 5: Decision Support for Catchment Management", and "Case 6: Participatory Revision of a Water Management Plan," the participatory aspects of the studies similarly led to the emergence of a diversity of value frames, which had to be taken into account on an equal footing, thereby forcing our own interventions to be transparent with respect to their normative anchorage. With the benefit of hindsight, this idea appears crucial, since it conditions our ability to support decision-makers in their own attempts to be transparent and accountable, in particular in their interactions with decision-support providers (be they researchers, consultants, or in-house policy analysts). This suggests the following normative property: applications of policy analytics should clarify, display, and account for their normative underpinnings, both in terms of the points of view taken into account and in terms of how interactions between analysts, decision-makers, and stakeholders unfold. This property excludes, for example, welfarist economic approaches, public management approaches, and others that do not make explicit their ethics- and values-based assumptions.
P4: Data meaningfulness. The term "analytics," in "policy analytics," was purposefully chosen to emphasize that one of the most important (if not the most important) ambitions of policy analytics is to reinforce the importance of reflecting on the nature and meaning of the data used to support policies. The general availability of numerous and sometimes large datasets that characterizes our digital age means that large quantities of data can be easily accessed and computed. But information on the context that led to the emergence of these data, the protocols used, their intrinsic limits, and the paradigms that should accompany their interpretation is often forgotten in this process. Devictor and Bensaude-Vincent (2016) and Jaric et al. (2019) provide detailed examples of the problems that this can create for environmental policies, as data are computed and interpreted in questionable ways. Several of our case studies were motivated by attempts to master the whole process of data generation and analysis needed to overcome such problems. In "Case 1: Elaboration of a Wetland Prioritization Platform", data were specifically selected and aggregated in different ways, depending on how stakeholders understand them. The choice of aggregation methods was then dictated by the interpretation of the data shared among acknowledged experts, and by known or suspected associated uncertainties and knowledge gaps, which involved avoiding commonly used, more mechanistic weighted-sum methods that silence these features of data. In "Case 2: Facilitating Dialog over a Marine Pollution Dispute", the methods used guided the data collection rather than the other way around. In "Case 3: Facilitating Reflection on a Collaborative Water Management Network", the data were constructed with the actors, with continuing attention to how various actors or groups understood them. In "Case 4: Water Management Policy Design", the P-KCP participatory tool (Pluchinotta et al. 2019; Giordano et al. 2020) assisted collaboration between policy makers and stakeholders, connecting local and expert knowledge within the whole design process, thanks to the construction of a collective problem understanding (i.e., a shared concern). Similarly, in "Case 5: Decision Support for Catchment Management" and "Case 6: Participatory Revision of a Water Management Plan", participants were encouraged to contribute to data framing and collection. In all the cases, the data meaningfulness issue hence appears crucial, and the ex-post analysis even suggests that it could have played a more central role. This is why we champion the following normative property: the analysis of the nature and meaning of data, determined by their context of emergence, the protocols used, intrinsic uncertainties and limits, and the associated paradigms, should play a key role in any application of policy analytics. Note that this requirement does not prevent including, and even advocating for, gathering experience on the go, for example through real-time sensor feeds or logbooks. These tools are meaningful for both reflexive ex-post analysis and formative tracking of system impacts, providing some immediate reflexivity or "feedback" to be used in the policy process itself, for example to identify a particular threshold that may be crossed.
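The contrast between mechanistic weighted-sum aggregation and rules that respect known data limits can be illustrated with a deliberately simple sketch. All criteria, scores, weights, and the veto threshold below are hypothetical and are not taken from the wetland prioritization platform:

```python
# Hedged illustration (hypothetical scores): compensatory weighted sums
# can silence a disqualifying criterion value, whereas a simple
# non-compensatory veto rule keeps it visible.

weights = {"ecology": 0.7, "cost": 0.3}
sites = {
    "site_A": {"ecology": 1.0, "cost": 0.1},  # outstanding ecology, unacceptable cost score
    "site_B": {"ecology": 0.6, "cost": 0.6},  # balanced profile on both criteria
}

def weighted_sum(scores):
    """Compensatory aggregation: strengths offset weaknesses."""
    return sum(weights[c] * scores[c] for c in weights)

def passes_veto(scores, floor=0.3):
    """Non-compensatory rule: any criterion below the floor disqualifies."""
    return all(v >= floor for v in scores.values())

for name, scores in sites.items():
    print(name, round(weighted_sum(scores), 2), passes_veto(scores))
```

Here the weighted sum ranks site_A above site_B even though site_A fails the veto rule: the compensation built into the weighted sum hides exactly the kind of data feature (a known, unacceptable weakness) that the property of data meaningfulness asks analysts to keep visible.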
The four properties articulated here can thus be seen to provide a concrete shape to the promise of policy analytics approaches, including allowing them to tackle a number of challenges associated with the digital age and participation, as spelled out in the introduction. Data meaningfulness (P4) aims to reduce the risk of policy makers feeling overwhelmed by data whose analysis can end up being entirely beyond their control, as well as to allow them to benefit from messy or unstructured data produced through participatory processes. Normative transparency (P3) can similarly be seen as a safeguard to prevent decision processes from being captured by black-box models and policy processes that obfuscate the actors and their stakes or interests in them. These two properties can be seen as two constraints on decision-support activities that, in what might seem a paradox at first glance, are at the same time all the more important and all the more difficult to abide by in the digital age. The importance and difficulty of the challenge justify the need for not just incremental improvements in policy analytics practice, but also major, disruptive innovations in policy making. These can only be delivered by ambitious research activities rethinking the very structure of decision-support science and practice. This is epitomized again by the emphasis on learning in P3 (normative transparency), while emphasizing that the innovations produced should have impacts in real life (P2, performativity) and fulfill real needs or demands rather than emerging from purely theoretical whims (P1, demand-orientedness).
Based on this analysis, we claim that these four normative properties should be understood as a definition of a bona fide application of policy analytics. Our case studies were not elaborated with these four normative properties in mind. Rather, as explained in our methodology, they were elaborated with the ambition articulated by policy analytics in mind. Specifically, the properties were identified ex post, through a structured collaborative process of discussion and case study analysis, so as to strengthen applications of policy analytics in the future. The six case studies therefore do not all materialize the four properties to the same degree. The four properties, however, arguably account for important aspects of all six case studies, and point to areas where each could ideally have been improved to lead to greater policy impact.

Agenda for Further Policy Analytics Research
As the above account illustrates, we conceive of the development of policy analytics as a dynamic project. It was launched as a conceptual contribution, but its contours are being refined as more and more practical applications are uncovered from past practice or implemented with the policy analytics concept in mind, subsequently stimulating reflection and prompting adjustments to both policy analytics theory and praxis. This paper has attempted to capture the core ideas and motivations underlying recent applications and developments of the concept. However, the resulting picture should not be seen as a final description, but rather as a step in a continuing dynamic, whereby we hope to further improve the framework in the years to come through new applications to what we see as emergent, challenging, and pressing issues. In this final section, we emphasize a handful of the major issues that could structure a useful research agenda for the policy analytics community in the near future to support it in achieving its ambitions. The connection of each research frontier to the properties spelled out above (P1-P4) is also briefly discussed.
Our examples above highlighted the importance of participatory approaches in demand-orientedness (P1). Accordingly, fully implementing this property raised challenges pertaining to stakeholder selection, which has long been an important research topic for researchers concerned with engineering participatory processes and participation in policy decisions (e.g., Daniell 2012; Nabatchi 2012). The works developed by policy analytics researchers allowed important advances in the design of participatory processes and the continuous diffusion of data and information through these processes, so as to ensure transparency, relevance, and informed decision-making. However, as the process unfolds, the boundaries of the issues tackled and the problem formulations can evolve. Due to this evolution, the group of stakeholders initially selected can become incomplete or partly irrelevant at a given stage of a policy-support process. Similarly, a choice made initially concerning the process design, e.g., the participatory methods selected or the roles assigned to some participants, may no longer be relevant later given this evolution. There is therefore a need to identify technologies or procedures to (1) facilitate coevolution of the participants involved and of the process design, while (2) keeping a memory of previous dialogs, achievements, and evolutions. This is a major research frontier for which policy analytics' distinctive interest in data analysis and meaning-giving provides value through collection and use of data generated throughout these participatory processes.
We have also seen above that participatory aspects of policy analytics projects play an important role in fulfilling the requirements associated with data meaningfulness (P4). Accordingly, another research frontier for the design of participatory processes is to elaborate means of identifying the data and information that the various participants need to meaningfully participate in the decision. Thinking more fundamentally about the notion of data, and about how data are created, modified, circulated, and reused outside their initially designed contexts, is also an important challenge, echoing the importance that policy analytics grants to data meaningfulness (P4). This reflection also has aspects concerning data sovereignty and ownership, and what these mean for policy analytics under different jurisdictions. In particular, policy analytics could integrate reflections about issues of power linked to ownership and diffusion of data, or the lack thereof. There are also links to issues of data privacy and accessing environment-related data about people, and how their use should be managed. Likewise, the challenges of which streams of data can be meaningfully and ethically integrated to provide a full (but perhaps too full) picture of people, their values, interests, and preferences are highly topical as governments and corporations look at their data assets and their perceived underuse (e.g., Löfgren and Webster 2020). More generally, as already reported in the literature (Mazri et al. 2019; Daniell et al. 2010), the design of participation structures is itself a topic of participation, requiring design methodologies in which participation is pragmatically considered. Data, when used within complex and long decision processes, are generally subject to several manipulation processes.
Ensuring the quality and meaningfulness of the entire data pipeline is today a major challenge for data science as a whole (Christophides et al. 2019). An additional critical issue for policy analytics is how to introduce innovation into public policies, for example by conceiving of currently inconceivable policies. The most promising ideas come from joining analytics with formal design tools, allowing the emergence of "out of the box" designs (Howlett 2011; Pluchinotta et al. 2019), and, in some cases, from a healthy dose of treating science fiction and the cutting edge of artistic inspiration as an option set worthy of formal investigation (Johnson 2011; Wenger et al. 2020).
Important research frontiers also concern how to implement normative transparency (P3) in a formalized, rigorous fashion. In this area, formal argumentation theory in artificial intelligence (Rahwan and Simari 2009) holds important promise to help improve discussions around policy analytics interventions. However, using these approaches in this setting raises important epistemological and methodological questions that they do not yet tackle. In particular, if these approaches are used in real-life collective decision processes, they will have to answer questions such as: who has the legitimacy to decide which arguments should be seen as sound and which as spurious, and how can transparency be guaranteed in argumentation processes? Cailloux and Meinard (2019) proposed a preliminary formulation of a framework designed to overcome this (and other) limitations of such approaches. Important challenges also lie in properly integrating such tools into discussions among people or groups, and into the reflections of the individuals involved, which remain the core of what normative transparency refers to.
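To make the promise of formal argumentation theory concrete, the following is a minimal illustrative sketch (not drawn from the case studies) of Dung-style abstract argumentation: arguments and an attack relation, from which the grounded extension, the most skeptical set of collectively acceptable arguments, is computed by iterating the characteristic function. The function name and data representation are our own illustrative choices.

```python
def grounded_extension(arguments, attacks):
    """Compute the grounded extension of an abstract argumentation
    framework (Dung 1995).

    arguments: a set of argument labels.
    attacks:   a set of (attacker, target) pairs.

    The grounded extension is the least fixed point of the
    characteristic function F(S) = {a | every attacker of a is
    itself attacked by some member of S}, reached by iterating
    F from the empty set.
    """
    # Precompute, for each argument, the set of its attackers.
    attackers = {a: {b for (b, c) in attacks if c == a} for a in arguments}
    S = set()
    while True:
        # Arguments counter-attacked by the current set S.
        attacked_by_S = {c for (b, c) in attacks if b in S}
        # An argument is acceptable if all its attackers are in attacked_by_S
        # (arguments with no attackers are trivially acceptable).
        new_S = {a for a in arguments if attackers[a] <= attacked_by_S}
        if new_S == S:  # fixed point reached
            return S
        S = new_S

# Example: a attacks b, b attacks c. Argument a is unattacked, so it is
# accepted; b is defeated by a; c is defended by a against b.
print(grounded_extension({"a", "b", "c"}, {("a", "b"), ("b", "c")}))
```

Even this toy version shows what such tools leave open, and what the questions above target: the framework computes which arguments survive, but deciding which claims count as arguments and which attacks are legitimate remains a human, normative choice.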
An associated issue, concerning next-generation algorithms (e.g., AI), is which metrics are considered relevant when such algorithms are used as part of policy analytics. For example, perhaps explicability of analytical processes and models is less relevant than legibility (Scott 1998) and trust. This is particularly important in automated/autonomous systems, where decision- and policymakers may need to understand the different algorithms, data streams, and sensors, and hence trust each layer in the supply chain. What would useful policy analytics look like in such systems?
Last, a major concern for future research related to performativity (P2) is the long-term sustainability of policy analytics interventions. Policy analytics activities should arguably have long-term benefits and co-benefits. Hence, a future research avenue is to identify what makes policy analytics approaches more salient for long-term policy support and interventions in a variety of contexts.
Our six case study examples illustrate how the notion of policy analytics, in its original conceptualization, proved useful for exploring important environmental issues and supporting environmental decision-makers in the field. This agenda for future research, in turn, shows how developing the concept through a bottom-up approach, far from closing debates with a final definition, can help structure future studies and open new research avenues to further strengthen environmental decision support and the application of policy analytics approaches more broadly.