I recently shared a Python notebook via Google Colab (and GitHub). The code implements an entire climate-science workflow, from data retrieval (seasonal forecasts and reanalysis) to the calculation of performance metrics (deterministic and probabilistic).
In recent years I have often needed to aggregate the data of a gridded dataset (for example ERA5 meteorological data) into a time series over specific borders (for example administrative regions).
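The aggregation step can be sketched as a masked, latitude-weighted spatial mean. The sketch below uses a toy NumPy array in place of real ERA5 data and a hand-made boolean mask standing in for a rasterized administrative boundary (in practice the mask would come from a shapefile, e.g. via regionmask or rasterio); the array shapes and names are illustrative assumptions, not the notebook's actual code.

```python
import numpy as np

# Toy gridded field: 4 time steps on a 3x4 lat/lon grid
# (stands in for a variable loaded from ERA5 GRIB/NetCDF files).
rng = np.random.default_rng(0)
lats = np.array([45.0, 45.5, 46.0])
field = rng.random((4, 3, 4))  # dimensions: (time, lat, lon)

# Boolean mask of the grid cells falling inside the region's borders;
# in practice this comes from rasterizing an administrative shapefile
# onto the grid (e.g. with regionmask).
region_mask = np.array([[0, 1, 1, 0],
                        [1, 1, 1, 0],
                        [0, 1, 0, 0]], dtype=bool)

# Grid-cell area shrinks with cos(latitude), so weight accordingly.
weights = np.cos(np.deg2rad(lats))[:, np.newaxis] * np.ones((3, 4))
w = np.where(region_mask, weights, 0.0)

# Weighted spatial mean over the masked cells: one value per time
# step, i.e. the regional time series.
series = (field * w).sum(axis=(1, 2)) / w.sum()
print(series.shape)  # -> (4,)
```

The same pattern carries over directly to xarray (`field.weighted(w).mean(dim=("lat", "lon"))`), which also keeps the time coordinate attached to the resulting series.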
NOTE: this post has been updated. The previous code was based on converting the GRIB files into NetCDF, which unfortunately introduced some issues.
Among the data products of the Copernicus Climate Change Service (C3S) available through the Climate Data Store, there is a collection of seasonal forecasts, available since 13 November and consisting of five different models (ECMWF, UK Met Office, Météo-France, DWD and CMCC).
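These forecasts can be retrieved programmatically with the `cdsapi` package. The sketch below assembles a request for the CDS catalogue entry "seasonal-monthly-single-levels"; the specific parameter values (centre, variable, months) are illustrative assumptions, and the actual download requires a CDS account with an API key in `~/.cdsapirc`.

```python
def build_request(centre="ecmwf", year="2023"):
    """Assemble a CDS request dict for one originating centre and year.

    The keys follow the CDS web form for the
    'seasonal-monthly-single-levels' dataset; adjust variable,
    months and lead times to your use case.
    """
    return {
        "originating_centre": centre,
        "variable": "2m_temperature",
        "product_type": "monthly_mean",
        "year": year,
        "month": "11",
        "leadtime_month": ["1", "2", "3"],
        "format": "grib",
    }


def download(target="seasonal.grib"):
    # Deferred import: needs the cdsapi package, network access and
    # valid CDS credentials, so it only runs when actually downloading.
    import cdsapi
    client = cdsapi.Client()
    client.retrieve("seasonal-monthly-single-levels",
                    build_request(), target)


request = build_request()
print(request["originating_centre"])  # -> ecmwf
```

Keeping the request builder separate from the download call makes it easy to loop over the five originating centres and fetch one GRIB file per model.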
It is fair to say that the Copernicus Climate Change Service (C3S) initiative is shaping the field of climate services. I might have said "climate science" instead of "climate services", but I want to focus here on the applied side of climate science.