Danish Meteorological Institute
Question
Asked 26 October 2015
How can I extract future precipitation and climate data from CMIP5 simulations?
Kindly guide me on how to download CMIP5 simulations for the emission scenario RCP 8.5, and how to extract future climate data for a specific catchment.
Popular answers (1)
The basic steps can be summarized as follows:
- Get access to the data via a node of the Earth System Grid Federation (ESGF); see for example http://www.dkrz.de/daten-en/IPCC-DDC_AR5. You may have to create an account before you can actually download the data. If a colleague at your institute has done this before, ask them to save time (a download sketch follows after this list).
- Prepare a data selection mask (details below), so that only the values within the catchment are taken into account.
- Process the data. Done.
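If you use one of the ESGF web portals, it can generate a wget download script for your search selection (for example project CMIP5, experiment rcp85, variable pr, monthly frequency). A minimal sketch of running such a script, with a purely illustrative file name:

# The portal generates this script for your data selection; the exact file name
# and options depend on the portal and script version. It will ask for your
# ESGF credentials if needed.
bash wget-cmip5-rcp85-pr.sh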
If you get the impression that the global CMIP5 data are too coarse for your question of interest, you may consider exploring the data from the WCRP CORDEX archive, where regional models of higher spatial resolution are driven by the typical CMIP5 scenarios. See for example http://cordex.org
Steps 2-3 in more detail
Anyhow, the general steps are similar regardless of the data set. I would first download a data set for a single year, for instance, together with the information on where each grid point is located in terms of latitude and longitude. I would then construct a mask in which each model grid point lying entirely within your catchment gets the value one (1) and every model grid point outside the catchment gets the value zero (0). Grid points that are only partly covered by the catchment can be treated in two ways. The simple one, for a first test, is: if the fractional coverage is above 50%, use one, else zero. In a later version you may use the more advanced approach of exploiting the actual fractional coverage, ranging from 0 (completely outside) to 1 (entirely inside) your catchment.
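As a minimal sketch of the simple variant with CDO, assuming a test file pr_test.nc with one year of data and an ASCII file catchment.txt listing the longitude/latitude vertices of the catchment polygon (see the CDO documentation of the maskregion operator for the exact format); note that maskregion selects grid points by their centre, which only approximates the 50% rule:

# Build a 0/1 mask on the model grid:
# maskregion keeps the grid points inside the polygon and sets the rest to missing,
# seltimestep reduces the result to a single time step,
# mulc/addc turn the remaining values into 1, setmisstoc turns the missing values into 0.
cdo setmisstoc,0 -addc,1 -mulc,0 -seltimestep,1 -maskregion,catchment.txt pr_test.nc catchment_mask.nc

# Apply the mask and compute the area-weighted catchment mean as a plausibility check.
cdo fldmean -ifthen catchment_mask.nc pr_test.nc pr_test_catchment.nc

The fractional-coverage variant is not covered by this sketch; it typically requires a GIS or scripting tool that can intersect the catchment polygon with the model grid cells.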
You can then apply the mask to your test data set and check whether the results are plausible. Afterwards, download the entire data set covering your period of interest and repeat the steps for the full period. For these operations you can either use your favourite tool or consider trying CDO (the Climate Data Operators: https://code.zmaw.de/projects/cdo/wiki/Cdo). They are very handy if you work with netCDF output from common climate models. For common Linux distributions they are easy to install; in Ubuntu, for example, the command is: apt-get install cdo (at least as long as you have administrative rights; if not, ask your system administrator).
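A hedged sketch of the same processing for the full period, assuming, as an example, files of the MPI-ESM-LR model that follow the usual CMIP5 naming convention (variable_table_model_experiment_ensemble_period.nc); the wildcard pattern and output names are assumptions:

# Concatenate the downloaded files of the full RCP8.5 period along the time axis.
cdo mergetime pr_Amon_MPI-ESM-LR_rcp85_r1i1p1_*.nc pr_rcp85_all.nc

# Catchment-mean precipitation time series, reusing the mask built above.
cdo fldmean -ifthen catchment_mask.nc pr_rcp85_all.nc pr_rcp85_catchment.nc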
Once you have done the job for one model and want to compare the results with another model, you have to redo all the steps above, since the model grid probably differs between models. Here, scripting the entire process (csh, tcsh, bash, ksh, …) can save you a lot of work in the end. However, the construction of the mask and the tests should be done for each model before a script is started; otherwise the results might be misleading and many working hours are wasted without proper testing.
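A hedged sketch of such a script in bash, assuming one pre-built and pre-tested mask per model; the model names and file patterns are placeholders:

#!/bin/bash
# Placeholder model names; replace with the CMIP5 models you actually use.
models="MPI-ESM-LR HadGEM2-ES GFDL-ESM2M"
for model in ${models}; do
    # One mask per model, built and checked beforehand as described above.
    mask="catchment_mask_${model}.nc"
    # Merge the files of this model and extract the catchment-mean series.
    cdo mergetime pr_Amon_${model}_rcp85_r1i1p1_*.nc pr_${model}_rcp85_all.nc
    cdo fldmean -ifthen ${mask} pr_${model}_rcp85_all.nc pr_${model}_rcp85_catchment.nc
done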
Good luck
4 Recommendations
All Answers (3)
Queensland Government
For Australia-centric data I've been using http://nrm-erddap.nci.org.au. Most of the data there are global, but as they are derived from NOAA it may be possible to find more specific regional data suitable for your needs.