Work


Atmospheric modelling


My main activity at LISA is the development and maintenance of the Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere (GECKO-A). GECKO-A is a modelling tool which (i) generates detailed chemical schemes according to a prescribed protocol assigning reaction pathways and kinetic data, and (ii) provides kinetic and thermodynamic properties based on experimental data and structure-activity relationships. The generated schemes typically contain up to several million species and reactions; GECKO-A is the only tool able to write such schemes automatically. It relies on a protocol for identifying reaction pathways, on experimental databases, and on methods for estimating missing parameters. Development work on the code consists of adding new reaction mechanisms, updating the estimation methods for physicochemical parameters, and updating the databases. Within the framework of the ANR MAGNIFY project, the GECKO-A protocol has been updated: aromatic structures and their chemistry are now taken into account, and five new methods for estimating rate constants have been implemented (five publications from 2018 to 2022). The next step is to finish cleaning up this version and release it to the community as open source.
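To give a flavour of what such a generator does, the toy sketch below (not GECKO-A code; all rule names and kinetic parameters are assumed for illustration) applies a single hypothetical reaction rule to a species and attaches an Arrhenius rate constant of the kind a structure-activity relationship would estimate when no experimental value exists:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius(A, Ea, T):
    """Rate constant k(T) = A * exp(-Ea / (R * T))."""
    return A * math.exp(-Ea / (R * T))

def expand(species, T=298.0):
    """Toy protocol: return the reactions generated for one species.

    A real generator would parse the molecular structure and apply a full
    set of rules; here a crude substring test stands in for a SAR.
    """
    reactions = []
    if "H" in species:  # hypothetical rule: OH abstraction from a C-H bond
        k = arrhenius(A=2.0e-12, Ea=1.0e3, T=T)  # assumed parameters
        reactions.append((f"{species} + OH -> {species}_radical + H2O", k))
    return reactions

print(expand("C2H6"))
```

A real protocol iterates this expansion over every product until the scheme is closed, which is how the species count reaches the millions mentioned above.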

The mechanisms from GECKO-A are then integrated into a box model. This 0D model simulates the oxidation of organic compounds over time. Within the framework of the SPIM-CO (LEFE) and TREMOLO (IPSL) projects, I worked on its development so that it can simulate urban plumes: preparing and using meteorological data from regional models as input to the box model, and recoding and updating the emission and deposition routines.
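As a minimal sketch of what a 0D box model integrates (not the LISA code; the rate constant, concentrations, and fixed-OH assumption are illustrative), a single VOC decaying by reaction with OH can be stepped forward in time with an explicit Euler scheme:

```python
import math

def run_box_model(voc0, oh, k, dt, n_steps):
    """Integrate d[VOC]/dt = -k [OH] [VOC] with forward Euler.

    voc0 : initial VOC concentration (molecules cm^-3)
    oh   : prescribed, constant OH concentration (molecules cm^-3)
    k    : bimolecular rate constant (cm^3 molecule^-1 s^-1)
    """
    voc = voc0
    for _ in range(n_steps):
        voc += -k * oh * voc * dt
    return voc

# one hour of oxidation with assumed values
voc_final = run_box_model(voc0=1.0e10, oh=2.0e6, k=2.0e-12, dt=1.0, n_steps=3600)
# for this linear case the result can be checked against voc0 * exp(-k*oh*t)
```

A production box model solves thousands of such coupled equations with a stiff ODE solver rather than Euler steps, but the structure of the problem is the same.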

More information, a list of publications, and an online tool can be found on the dedicated website.



Data treatment and analysis


Field campaigns and laboratory experiments generate large amounts of data. In this context, one of my activities consists of developing processing codes to (i) rework data (i.e. merge and/or reorganize data from several sources), (ii) extract information from raw data, and (iii) visualize them automatically.

      - I have developed several codes to automatically generate web-accessible visualizations for the INDAAF monitoring stations (International Network to study Deposition and Atmospheric composition in Africa) supervised by LISA members. The scripts merge and rework the data and flag errors and missing values, generating processed CSV files from the hundreds of retrieved files and sparing staff from doing this manually. I am also in charge of creating NetCDF output files for distribution.
      - A new observation station was installed in Gobabeb in 2022 to study aerosol optical properties and microphysics. Its data are automatically uploaded to a cloud server on a daily basis. I have written scripts to (i) retrieve the data each day, (ii) process them, and (iii) generate quicklooks available on an internal web page for the laboratory staff involved in the project.
      - Since 2021, LISA has operated an air quality monitoring station on the roof of the building in Créteil. This station hosts several instruments, such as gas analysers, a FIDAS, and a weather station. I have been in charge of developing visualization scripts for all these instruments and making them available to the laboratory staff.
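The merge-and-check step common to these pipelines can be sketched as follows (a minimal example using only the standard library; the column layout and station data are hypothetical, not the actual INDAAF or station formats):

```python
import csv
import io

def merge_and_flag(sources):
    """Merge rows from several CSV sources, flagging empty values as 'NA'.

    `sources` is a list of CSV texts sharing the same header; rows are
    pooled, missing values are marked, and the result is sorted by date.
    """
    merged = []
    for text in sources:
        for row in csv.DictReader(io.StringIO(text)):
            clean = {k: (v if v not in ("", None) else "NA") for k, v in row.items()}
            merged.append(clean)
    merged.sort(key=lambda r: r["date"])
    return merged

# two hypothetical station files, one with a missing PM10 value
station_a = "date,pm10\n2022-01-02,35\n2022-01-01,\n"
station_b = "date,pm10\n2022-01-03,40\n"
rows = merge_and_flag([station_a, station_b])
```

The real scripts add instrument-specific sanity checks and write the cleaned result to CSV and NetCDF, but the merge/flag/sort pattern is the core of the processing.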



Others


I am involved in the IT department at LISA.