Suman Maity
Japan Agency for Marine-Earth Science and Technology (JAMSTEC) · Research Institute for Global Change
Working on CO2 Data Assimilation using MIROC4.0 at JAMSTEC, Japan
The prediction of extremely severe cyclonic storms has been a long-standing and challenging issue due to their short life period and large variation in intensities over a short time. In this study, we predict the track, intensity, and structure of an extremely severe cyclonic storm (ESCS) named 'Fani,' which developed over the Bay of Bengal region...
The prediction of an extremely severe cyclonic storm (ESCS) is a challenging issue due to the large variation in intensity and short life period of such storms. In this study, the ESCS Fani, which developed over the Bay of Bengal region during 27 April to 4 May 2019 and made landfall over the Odisha coast of India, is investigated to forecast the storm track, intensity and struct...
The study investigates the influence of near-surface atmospheric parameters on land surface processes at the land–atmosphere interface through the offline simulation of the 2D Noah Land Surface Model-based High-Resolution Land Data Assimilation System (HRLDAS). The HRLDAS is used to conduct sensitivity experiments by introducing perturbations in the a...
This study investigated the precipitation and temperature climatologies over India from large ensemble (100 members) historical climate simulations in two recent past climate periods (1951–1980 and 1981–2010). The main focus was to statistically examine the usefulness of such large historical climate simulations by discussing (1) the precipitation...
Atmospheric nitrous oxide (N2O) contributes to global warming and stratospheric ozone depletion, so reducing uncertainty in estimates of emissions from different sources is important for climate policy. Here, we simulate atmospheric N2O using an atmospheric chemistry-transport model (ACTM), and the results are first compared with the in situ measur...
In this study, the interannual variability (IAV) of the Indian Summer Monsoon (ISM) is investigated using multi-year (1982‒2016) seasonal scale simulations (May‒September) of the regional climate model RegCM4. Model-simulated fields such as surface temperature, wind and rainfall are first validated to verify the climatological behaviour of the ISM. Su...
In this study, the interannual variability (IAV) of the Indian Summer Monsoon (ISM) is investigated using multi-year (1982‒2016) seasonal scale simulations (May‒September) of the regional climate model RegCM4, developed by the International Centre for Theoretical Physics, Italy. Model-simulated fields such as surface temperature, wind and rainfall are valid...
This study performed a land use and land cover (LULC) change analysis over Southern India for the period 1981–2006 from the normalized difference vegetation index (NDVI) images of AVHRR data and applied the “observation minus reanalysis” (OMR) method to investigate the impact of the LULC change on the temperature of the region. The LULC change anal...
Changes in land use and land cover (LULC) in recent decades are among the factors responsible for recent climate change. In this study, the impact of LULC changes on the climate over India was assessed through multi-decadal simulations (3 decades) using the Regional Climate Modeling system (RegCM4) with fixed and with changed LULC. Difference...
Soil moisture is one of the key components of land surface processes and a potential source of atmospheric predictability that has received less attention in regional-scale studies. In this study, an attempt was made to investigate the impact of soil moisture on Indian Summer Monsoon simulation using a regional model. We conducted seasonal...
In this study, we explored the performance of the cumulus convection parameterization schemes of the Regional Climate Modeling System (RegCM) for the Indian summer monsoon (ISM) of a catastrophic year through various numerical experiments conducted with different convection schemes (Kuo, Grell and MIT) in RegCM. The model is integrated at 60 km hori...
This study explores the influence of land-use and land cover (LULC) changes on the temperature over North India (NI) and North-Eastern India (NEI) during 1981–2006 by subtracting the reanalysis temperature from the observed temperature (observation minus reanalysis (OMR) method). The normalized difference vegetation index (NDVI) data of the AVHRR s...
The Land Surface Air Temperature (LSAT) climatology during 1961–1990 and the anomalies (relative to the 1961–1990 climatology) have been developed over the Pan-East Asian region at a monthly 0.5°×0.5° resolution. The development of these LSAT datasets is based on the recently released C-LSAT station datasets and the high-resolution Di...
This magazine article describes an analysis of regional-scale concentrations of fine particles in India during wintertime.
In the present study, sensitivity of Indian Summer Monsoon (ISM) to the cumulus convection scheme (CCS) is assessed using Regional Climate Model (RegCM4.4.5). Seasonal scale (May–June–July–August–September) simulation of the model forced with European Centre for Medium Range Weather Forecasts reanalysis data (ERA interim) is carried out for the con...
In this study, the performance of the Regional Climate Modeling system version 4 (RegCM4) is evaluated over Indian regions through a 30-year climate simulation. The main focus is on the model bias, the correlation between model-simulated and observed results, and the standard error in the model simulations for the 30-year mean temperature...
In the present work, the performance of six convective parameterisation schemes (CPSs) of the Weather Research and Forecasting (WRF) model is tested for the intensity simulation of cyclonic storms Mala, Sidr, Nargis, Aila, Laila, Jal and Thane, which formed over the Bay of Bengal (BoB), with special focus on Jal (formed in November 2010). The study f...
The Regional Climate Modeling system (RegCM) has been widely used for climate simulation at seasonal to multi-decadal scales over many regions across the world in recent decades. In this study, we conducted a 10-year simulation using RegCM4 by replacing the United States Geological Survey (USGS) land use data (the default land use data in RegCM) w...
The Indian Summer Monsoon (ISM) is driven by organized large-scale convection; hence, its simulation is expected to depend on an appropriate representation of cumulus convection in the model. In the present study, the performance of different cumulus parameterization schemes is examined towards simulations of the ISM. The Regional Climate Model (Re...
In this study, an attempt has been made to investigate the sensitivity of land surface models (LSMs) and cumulus convection schemes (CCSs) using a regional climate model, RegCM version 4.1, in simulating the Indian Summer Monsoon (ISM). Numerical experiments were conducted at the seasonal scale (May–September) for three consecutive years: 2007, 2008, 2009...
The regional climate model (RegCM4) is customized for 10-year climate simulation over Indian region through sensitivity studies on cumulus convection and land surface parameterization schemes. The model is configured over 30° E–120° E and 15° S–45° N at 30-km horizontal resolution with 23 vertical levels. Six 10-year (1991–2000) simulations are con...
First, I have seen that the probability density function (PDF) is used in meteorology for comparison; for example, when I have two time series of the same meteorological parameter from two different sources. Is it only used to judge how comparable the two datasets are, or is there another scientific reason?
Second, how is this done in practice?
Please have a look at the attached figure. I have also attached a txt file containing the bias values. How can I generate a figure like "Figure.png" using the data from "bias1.txt"? The figure appears to have been generated with NCL. I tried to follow the NCL guide for "pdfx" with the following script:
a = addfile("new.nc","r")
ts = a->ts
opt = True
opt@bin_spacing = 0.2
max_ws = max(ts)
min_ws = min(ts)  ; this line was missing: min_ws must be defined before computing nbins
nbins = floattointeger(((max_ws - min_ws)/0.2)+1)
ap = pdfx(ts, nbins, opt)
wks = gsn_open_wks("png","pdf")
res = False
plot = gsn_csm_xy(wks, ap@bin_center, ap, res)
But I am getting a figure with many spikes rather than a smooth (normal) curve like "Figure.png". How should I proceed?
Any suggestions or help will be highly appreciated.
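Not being sure of the exact NCL workflow behind "Figure.png", here is a hedged numpy-only sketch of the same idea: a normalized histogram gives the spiky empirical PDF, while a Gaussian kernel density estimate gives a smooth curve. The bias values below are synthetic stand-ins for the data in "bias1.txt".

```python
import numpy as np

# synthetic stand-in for the bias values in "bias1.txt"
rng = np.random.default_rng(0)
bias = rng.normal(loc=0.5, scale=1.2, size=500)

# empirical PDF: normalized histogram with 0.2-wide bins (often spiky)
bins = np.arange(bias.min(), bias.max() + 0.2, 0.2)
hist, edges = np.histogram(bias, bins=bins, density=True)

# smooth PDF: Gaussian kernel density estimate, Silverman's rule bandwidth
h = 1.06 * bias.std() * bias.size ** (-1.0 / 5.0)
x = np.linspace(bias.min(), bias.max(), 200)
smooth_pdf = np.exp(-0.5 * ((x[:, None] - bias[None, :]) / h) ** 2).sum(axis=1)
smooth_pdf /= bias.size * h * np.sqrt(2.0 * np.pi)
```

Plotting `hist` against the bin centers reproduces the spiky curve, while `smooth_pdf` against `x` gives the smooth one; in NCL, increasing `bin_spacing` (i.e. using fewer bins) has a similar smoothing effect.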
This is Dr. Suman from IIT Bombay, India. Recently I have been evaluating model performance using the Index of Agreement (IOA; Mbienda et al., 2017). The IOA is a skill metric for model performance evaluation introduced by Willmott et al. (1981), later refined by Willmott et al. (2012) and used by Mbienda et al. (2017). I have the following question.
In general, statistical metrics used in meteorology may be calculated over space as well as time (for example, spatial or temporal correlation). Is the same true for the IOA? In particular, the IOA formula contains two summations: should these be taken over the whole time period, or over all the spatial grid points?
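As an illustration (the standard Willmott 1981 form, not necessarily the exact variant in Mbienda et al.), the IOA can be computed along either dimension by choosing the summation axis; here both summations run over time, giving one value per grid point:

```python
import numpy as np

def index_of_agreement(pred, obs, axis=0):
    """Willmott (1981) index of agreement, with both summations taken
    along `axis` (axis=0 = time, giving one IOA value per grid point)."""
    pred = np.asarray(pred, dtype=float)
    obs = np.asarray(obs, dtype=float)
    obs_mean = obs.mean(axis=axis, keepdims=True)
    num = np.sum((pred - obs) ** 2, axis=axis)
    den = np.sum((np.abs(pred - obs_mean) + np.abs(obs - obs_mean)) ** 2, axis=axis)
    return 1.0 - num / den

# made-up example: 3 time steps at 2 grid points, summation over time
obs = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 4.0]])
pred = np.array([[1.1, 2.0], [1.9, 3.1], [3.2, 3.9]])
print(index_of_agreement(pred, obs, axis=0))  # one IOA per grid point
```

Summing over all grid points at one time instead would give a spatial IOA per time step (`axis=1` in this layout); which one is meant depends on the study design.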
Please reply at the earliest.
This is Suman from IIT Bombay. I am using ECHAM-HAM for my study. I saw that the model reads the oxidant fields through "ham_oxidants_monthly.nc", which is linked to "ham_oxidants_monthly_T63L47_macc.nc". I have the following queries:
1) Is it customary to use "ham_oxidants_monthly_T63L47_macc.nc" for the oxidant fields? If I want to use different oxidant data, may I do that, and if so, what are the necessary steps? I have attached a sample (having 2 time steps only) that I want to use with the model; please check it and tell me what to do.
2) I saw that "ham_oxidants_monthly_T63L47_macc.nc" contains 12 months of data, i.e. an 8-year (2003-2010) climatology. Does the model accept only climatological data for the oxidant file? If I have 5 years of monthly data (5×12 = 60 months) and want to use it, do I need to prepare a 5-year monthly climatology first?
3) I also noticed that HO2 is not included in the default oxidant fields (H2O2, NO2, NO3, O3, OH) in "ham_oxidants_monthly_T63L47_macc.nc". If I want to incorporate HO2 into the oxidant fields, what is the possible way to do that?
Please reply at the earliest.
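On point 2, if a monthly climatology turns out to be required, collapsing 5 years of monthly fields into a 12-month climatology is straightforward; a minimal numpy sketch with a made-up field (not ECHAM-HAM-specific):

```python
import numpy as np

# made-up monthly field: 5 years x 12 months on a small 4 x 6 grid
nyears, nlat, nlon = 5, 4, 6
rng = np.random.default_rng(0)
monthly = rng.normal(size=(nyears * 12, nlat, nlon))

# 12-month climatology: average each calendar month across the 5 years
clim = monthly.reshape(nyears, 12, nlat, nlon).mean(axis=0)
print(clim.shape)  # (12, 4, 6)
```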
Dear All,
This is Suman from IIT Kharagpur, India. I have a very basic query, described as follows. Any insight will be appreciated.
Harmonic analysis is a basic mathematical technique used in various branches of science. In meteorology and atmospheric sciences, we regularly use harmonic analysis to eliminate the annual (1st harmonic) and semi-annual (2nd harmonic) variation from data. From the earlier literature, I noticed that in most cases a long-term series of monthly data is used for this purpose. For example, given a complete series of 30 years (1951-1980) of monthly data (30×12 = 360 months), the seasonal variability of the original data is eliminated by subtracting the 1st, 2nd and sometimes 3rd harmonics, considering a Fourier series over the 12 months. Please correct me if I am wrong.
My case is similar, but I do not have data for the whole year, as I am interested in a seasonal study (the Indian Summer Monsoon). I have 30 years of data covering only June, July, August and September: June-September of 1951, June-September of 1952, June-September of 1953, and so on. If I want to eliminate the seasonal harmonics, what should I do? Since I do not have data for all months, it seems unscientific to apply the above procedure to my data. Is the idea only applicable to complete data (i.e. having all months)? Please help.
Suman Maity, Ph.D. student, IIT Kharagpur, India.
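For the complete-series case described above, the standard procedure can be sketched as follows (synthetic data with a known annual and semi-annual cycle; this illustrates the full-year method only and does not resolve the JJAS-only question):

```python
import numpy as np

# synthetic 30-year monthly series with known annual and semi-annual cycles
nyears = 30
t = np.arange(nyears * 12)
month = t % 12
rng = np.random.default_rng(1)
series = (2.0 * np.sin(2.0 * np.pi * month / 12.0)    # annual (1st harmonic)
          + 0.5 * np.cos(4.0 * np.pi * month / 12.0)  # semi-annual (2nd harmonic)
          + rng.normal(0.0, 0.1, t.size))             # noise

# mean annual cycle (12-month climatology), then its first two harmonics
clim = series.reshape(nyears, 12).mean(axis=0)
m = np.arange(12)
seasonal = np.zeros(12)
for k in (1, 2):   # fit and accumulate the 1st and 2nd harmonics of the climatology
    a = 2.0 / 12.0 * np.sum(clim * np.cos(2.0 * np.pi * k * m / 12.0))
    b = 2.0 / 12.0 * np.sum(clim * np.sin(2.0 * np.pi * k * m / 12.0))
    seasonal += a * np.cos(2.0 * np.pi * k * m / 12.0) + b * np.sin(2.0 * np.pi * k * m / 12.0)

# subtract the fitted seasonal harmonics from every month of the series
deseasonalized = series - seasonal[month]
```

The residual series retains the noise (and any non-seasonal signal) while the annual and semi-annual cycles are removed.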
This is Suman Maity from IIT Kharagpur, India. I want to ask a very basic question, which is as follows:
I was doing EOF analysis on rainfall data over the Indian domain. To check my understanding, I performed it with several software packages (GrADS, MATLAB, CDO, etc.) using the same dataset. Unfortunately, they do not give the same result, which makes me unsure whether I am doing it correctly. Since I am quite new to this field, I have only a limited understanding.
Please reply with the possible reasons for this and the easiest method of finding the EOFs.
Any help will be highly appreciated.
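One common reason different tools disagree is that EOF signs and scalings are arbitrary, and some tools apply latitude (area) weighting by default while others do not. A minimal numpy sketch of EOFs via SVD of the anomaly matrix (random stand-in data, no latitude weighting):

```python
import numpy as np

# random stand-in for a time x space rainfall anomaly matrix
rng = np.random.default_rng(2)
ntime, nspace = 100, 50
data = rng.normal(size=(ntime, nspace))
anom = data - data.mean(axis=0)        # remove the time mean at each point

# EOFs via singular value decomposition of the anomaly matrix
u, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt                              # rows: spatial EOF patterns
pcs = u * s                            # columns: principal-component series
explained = s ** 2 / np.sum(s ** 2)    # variance fraction per mode
```

When comparing packages, the explained-variance fractions should agree even when the patterns differ in sign or normalization.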
I have a query regarding the computation of the lagged composites of daily rainfall anomalies in the attached paper. I could not understand how they were computed.
Any kind of help would be appreciated.
As I understand it, the standardized anomaly of a time series is calculated by subtracting its mean from the series and dividing by its standard deviation. Suppose I have a 30-year monthly dataset and want to find standardized monthly anomalies from it. Then I need to compute the climatology of each calendar month and subtract it from the corresponding months. But what should the standard deviation be in this case?
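One common convention (an assumption here, not the only one) is to use the interannual standard deviation of each calendar month, computed across the 30 years, as the denominator; a numpy sketch with synthetic data:

```python
import numpy as np

# synthetic 30-year monthly dataset
nyears = 30
rng = np.random.default_rng(3)
series = rng.normal(size=nyears * 12)

x = series.reshape(nyears, 12)
clim = x.mean(axis=0)                    # climatology of each calendar month
std = x.std(axis=0, ddof=1)              # interannual std of each calendar month
std_anom = ((x - clim) / std).ravel()    # standardized monthly anomalies
```

With this choice, each calendar month's anomalies have zero mean and unit standard deviation across the 30 years.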
This is Suman from the Indian Institute of Technology Kharagpur, India. Recently I was doing EOF analysis of monthly rainfall data during 1951-1980 (30 years × 12 = 360 months). I performed the analysis using the Climate Data Operators (CDO), a popular tool in this field. I am not able to interpret the spatial distributions of the different EOF modes. How do I understand what they mean physically? For example, my 1st mode explains about 72% of the variance and shows a spatial distribution similar to the climatological mean of that period. I am in two minds about whether the result is correct or not, and even if it is correct, what physical process it represents.
Any guidance will be of immense help to me.
I am using RegCM for simulation. I can easily plot the topography, but I also need to plot the model's buffer zone along with the topography. Since there is no such parameter in the RegCM model output, how can I draw it? My map projection is Lambert conformal.
Any suggestions/comments will be highly appreciated.
I am using RegCM as my simulation tool. As we all know, in any atmospheric model parameterization is essential for representing subgrid-scale processes, and among the parameterizations, cumulus convection is the most important. Every model contains a number of schemes for it, each differing in various aspects. Although some arguments can be made about their positives and negatives, a fully rigorous justification is not possible. What I need is a good practical document on any one cumulus parameterization scheme, with both theoretical and practical information: one that discusses the formulation, the equations, their solutions, and the computer code (Fortran) that solves them.
I have tried a lot but could not find such a complete document. Please reply.
Any kind of help will be appreciated.
I am using RegCM4 for my research. I have a very basic query regarding closure assumptions in the cumulus convection schemes, which is as follows:
As far as I know, there are two closure assumptions available in RegCM:
1) Arakawa-Schubert (AS)
2) Fritsch-Chappell (FC)
Also, a number of cumulus convection schemes (CCS) are available: Kuo, Grell, MIT, Tiedtke and Kain-Fritsch. My query is:
Do these closures have any impact on schemes other than Grell? I have seen in the literature that experiments were performed using Grell with the AS closure and Grell with the FC closure separately, showing that Grell with FC performs better than Grell with AS; consequently, the closure assumption has a strong impact on the Grell scheme. My question is whether the closure assumption also affects the other schemes (Kuo, MIT, Tiedtke and Kain-Fritsch), or whether they are independent of it. I have never seen any article of this kind: specifically, would Tiedtke with AS and Tiedtke with FC give different results, and is the same true for the MIT, Kuo and Kain-Fritsch schemes? Please reply.
Any kind of information will be highly appreciated.