Innovation dans l'expertise

Managing as much as predicting. What predictive policing algorithms do to police work and police organisation

2/15/2019


 
At the end of last year, Bilel Benbouzid published, in the special issue of Réseaux on predictive machines (see below), a new article drawing on his research on predictive policing in the United States. He shows there that "predictive machines are moral technologies of government. They serve not only to predict where and when crimes are likely to occur, but also to regulate police work. They compute relations of equivalence, distributing security across the territory according to multiple criteria of cost and social justice. Tracing the origins of predictive policing back to the Compstat system, one can observe the shift from machines for exploring intuitions (the police officer keeps the upper hand over the machine) to applications that erase the reflexive dimension of proactivity, turning prediction into the support for metrics that 'dose' the quantity of police work. Under the pressure of a critical movement denouncing the discriminatory biases of predictive machines, developers are now devising techniques for auditing training datasets and calculating a reasonable amount of police stops in the population."
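The article stays at the level of analysis and does not spell out a recipe for these audits. To give a concrete sense of the kind of quantity such a data audit computes, here is a minimal sketch on entirely synthetic data: the rate of predicted police stops per demographic group and the ratio between groups, a disparate-impact style comparison. The group labels, probabilities and skew are all invented for the illustration and are not drawn from Benbouzid's material.

```python
import numpy as np

# Entirely synthetic data: 10,000 fictitious incidents, each with a
# demographic group label and a model's "predicted stop" flag.
rng = np.random.default_rng(0)
group = rng.choice(["A", "B"], size=10_000, p=[0.7, 0.3])
# Hypothetical predictions, deliberately skewed against group B for the example.
predicted_stop = rng.random(10_000) < np.where(group == "B", 0.35, 0.20)

# A basic audit metric: the rate of predicted stops per group, and their ratio
# (a disparate-impact style comparison).
rates = {g: predicted_stop[group == g].mean() for g in ("A", "B")}
print("Predicted stop rate per group:", rates)
print("Ratio B/A:", rates["B"] / rates["A"])
```

Actual audit techniques are of course richer than this toy; the sketch is only meant to show what a quantity of police work can look like once it is made calculable.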

Do forecasts matter in policy paradigm change? Stefan Aykut on competing predictive assemblages in energy policy in France and Germany

2/15/2019


 
In April 2019, the special issue on foreknowledge in public policy developed by Stefan Aykut, Bilel Benbouzid and David Demortain in the framework of INNOX will be published by Science & Technology Studies. In the meantime, the first paper of this special issue has been published online. In "Reassembling Energy Policy", Stefan Aykut shows that visions of policy futures emerge from what he calls predictive assemblages. The term captures the fact that, in a policy domain such as the energy sector, coalitions of actors are equipped with their own models and forecasts, which cohere in turn with a normative discourse about future developments in energy systems. Actors, models and discourses together form the assemblage.

This original perspective is particularly helpful in revealing the politics behind modeling and anticipation for policy: there are competing assemblages in any given country at any given time. Stefan compares the changing predictive policy assemblages in France and Germany from the 1960s to the present.
Ultimately, Stefan teaches us how and to what extent models and predictions enable policy change, but also shows how to go beyond conventional accounts of the performativity of models in policy. As he says, "further research should not only focus on the effects of foreknowledge on expectations and beliefs (discursive performativity), but also take into account how new models equip political, administrative and market actors (material performativity), and how forecasting practices recompose and shape wider policy worlds (social performativity)."

The paper may be downloaded below.

aykut_-_2019_-_reassembling_energy_policy.pdf


The true 'in silico toxicology' revolution and the role of regulators in toxicity testing innovations

2/15/2019


 
In an insightful article about computer-based, in silico toxicity testing methods, Jim Kling argues that "where there is sufficient data that is properly analyzed, in silico methods can likely reduce and replace animal testing. And even when the data is sparse, it can at least help guide the way." Kling must be commended for updating us on the latest developments in QSAR modelling or organ-on-a-chip technology, but perhaps more importantly, for going beyond the technological promises of in silico testing and showing us empirically, instead, what in silico testing actually achieves in terms of prediction. As the research conducted in INNOX shows — several papers are forthcoming about QSAR, PBPK and other modelling techniques — in silico testing assembles with other information and knowledge. It does not replace experiments, but mostly helps to frame further experiments and to exploit their results as much as possible.
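For readers who have never seen a QSAR model up close, the basic workflow is easy to sketch: numerical descriptors of chemical structure are used to fit a statistical model on compounds with known outcomes, and the fitted model is then asked to predict the activity of untested compounds. The sketch below is purely illustrative, with invented descriptor values and toxicity labels and an off-the-shelf scikit-learn classifier; it is not one of the models studied in INNOX or reviewed by Kling.

```python
# Toy illustration of the QSAR workflow: structural descriptors -> model -> prediction.
# All descriptor values and toxicity labels are invented for the example; in practice
# descriptors are computed from molecular structures (e.g. with cheminformatics
# toolkits) and labels come from curated experimental databases.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Rows = training compounds; columns = hypothetical descriptors
# (say molecular weight, logP, polar surface area).
X_train = np.array([
    [180.2, 1.2, 63.6],
    [46.1, -0.3, 20.2],
    [278.3, 4.5, 37.3],
    [151.2, 0.5, 49.3],
    [390.6, 5.8, 25.8],
])
y_train = np.array([0, 0, 1, 0, 1])  # 1 = "toxic" in this fictitious dataset

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Predict the class and class probabilities for an untested, fictitious compound.
X_new = np.array([[320.4, 4.9, 30.1]])
print(model.predict(X_new), model.predict_proba(X_new))
```

The sketch also makes the limits plain: the model can only interpolate from the training data it is given, which is why the quality and coverage of that data, and the experiments that produce it, remain decisive.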

One comment, though. The view that regulatory agencies are “slow to adopt these approaches” and need to be further “convinced to trust them” misrepresents the reality of innovation in toxicity testing. This is indeed a common view: regulatory agencies are said to be reluctant to take on board new kinds of data and studies, and to prefer sticking to the conventional methods established in laws and guidelines. They are portrayed as conservative, making decisions only on the basis of what animal experiments, still the gold standard, show. But this is only part of the actual history of regulatory science, at least as far as the development of computational toxicology methods goes. It is difficult to overestimate the role of the Office of Toxic Substances of the Environmental Protection Agency (EPA) in the initial realization, back at the end of the 1970s, that structure-based predictions could help in reviewing chemicals at a fast rate, its responsibility in developing a large database of ecotoxicological data on 600 chemicals to produce validated statistical models, or its patient creation of dedicated software to help chemical firms replicate structure-activity methods. Similarly, while ToxCast is a program of the EPA's Office of Research and Development, the initial impulse of the head of the pesticides and toxics office, who realized the need for faster chemical screening methods, was instrumental in its launch.

Regulatory science, as its name indicates, is an intriguing mix of ideas and technologies emerging from academia, industry and regulatory agencies. In this ecosystem, regulators play an essential part: pointing to potential developments, setting the criteria of validity for new methods, and funding technological development. In silico toxicology would not be where it is now without them.


"Machines à prédire", un numéro spécial de Réseaux par Bilel Benbouzid et Dominique Cardon pour comprendre le renouveau actuel de l'intelligence artificielle

2/15/2019


 
Machine learning, deep learning, neural networks... there is nothing new about these technologies of predictive computation, but the forms this kind of computation takes today give it an unprecedented character. This is what Bilel Benbouzid and Dominique Cardon set out to demonstrate through a selection of articles published in the journal Réseaux at the end of last year, devoted to what they call "predictive machines": "computational devices that rationalize the future by making it available to preventive forms of action". One of the things that explains the renewal of artificial intelligence is the existence of controversies over its past forms, controversies that led algorithm designers to rethink their usefulness and, from there, the type of predictions produced. As Benbouzid and Cardon sum it up, what distinguishes this artificial intelligence is its embedding in social and organized worlds. In the current regime of anticipation, "the result of a computation is satisfactory if it makes useful machines work, oriented more towards action than towards the explanation of phenomena". Each article of this rich special issue illustrates the point. Online here: https://www.cairn.info/revue-reseaux-2018-5.htm

