Robustness - Science topic

Explore the latest questions and answers in Robustness, and find Robustness experts.
Questions related to Robustness
  • asked a question related to Robustness
Question
3 answers
Using SmartPLS 4 for data analysis, I obtained two VIF values greater than 10 and an HTMT value of 1.07.
How can I adequately justify these values, and which robust, relevant references can I use to persuade the reviewers?
Thank you
Relevant answer
Answer
Thank you very much, Vasilica Maria Margalina and Judit Albert.
The constructs I'm using are reflective. Judit Albert, could you please explain further what you mean by "Justify only if the constructs are theoretically expected to be closely related"? My focus is on finding relevant references to support and justify the high values for two constructs of my model. Thanks
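For readers who want to check these diagnostics outside SmartPLS, here is a minimal sketch of how VIF and the HTMT ratio (Henseler et al., 2015) can be computed from raw indicator data. The DataFrame, column names and construct assignments are hypothetical placeholders, not part of the original question:

```python
import numpy as np
import pandas as pd

def vif(df: pd.DataFrame) -> pd.Series:
    """Variance inflation factor per column: 1 / (1 - R^2) from
    regressing that column (with intercept) on all the others."""
    out = {}
    X = df.to_numpy(dtype=float)
    for j, col in enumerate(df.columns):
        y = X[:, j]
        A = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        out[col] = 1.0 / (1.0 - r2)
    return pd.Series(out, name="VIF")

def htmt(df: pd.DataFrame, items_a: list[str], items_b: list[str]) -> float:
    """Heterotrait-monotrait ratio (Henseler et al., 2015): mean
    between-construct indicator correlation over the geometric mean
    of the within-construct indicator correlations."""
    corr = df.corr().abs()
    hetero = corr.loc[items_a, items_b].to_numpy().mean()
    def mono(items):
        c = corr.loc[items, items].to_numpy()
        return c[np.triu_indices_from(c, k=1)].mean()
    return hetero / np.sqrt(mono(items_a) * mono(items_b))

# Hypothetical usage with made-up indicator/construct names:
# data = pd.read_csv("survey.csv")
# print(vif(data[["x1", "x2", "x3", "y1", "y2"]]))
# print(htmt(data, ["x1", "x2", "x3"], ["y1", "y2"]))
```

An HTMT above 1 means the between-construct indicator correlations exceed the within-construct ones on average, which is why reviewers read it as a discriminant validity problem rather than a value that can simply be defended by citation.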
  • asked a question related to Robustness
Question
3 answers
Is flood forecasting for the rivers of the European continent a scientific operation? Can a human mechanism be created to accurately predict floods in the European continent?
Pluvial flooding is a result of overland flow and ponding before the runoff enters any watercourse, drainage system or sewer, or cannot enter it because the network is full to capacity, usually caused by intense rainfall. River and coastal floods get the most attention since they are largest and last the longest, while pluvial floods are relatively marginalized in research. Therefore, the main goal of this research was to show the risk posed by pluvial floods and their connection to current global climate change processes, and to present effects of flooding in European cities, as well as what we can expect in the future. Furthermore, the aims were to present and become more familiar with scientific projects, strategies, directives and measures, devised both on national and international levels, that deal with urban pluvial flood issues across the European continent. Climate change projections indicate that there will be an increase in the frequency and intensity of rainfall events throughout Europe, and along with ongoing urbanization, the problem of pluvial flooding will most certainly require more attention, which it is starting to receive. Some countries have already developed their strategies and initiatives and implemented both structural and non-structural measures, such as spatial planning, constructional measures, information systems, reducing land sealing through policies, building codes and standards, and on-site improvement of retention, infiltration, evaporation, and rainwater recycling with the use of green roofs, permeable or porous pavements, rain gardening or urban rainwater harvesting. At the same time, there are numerous research papers, studies, conferences and workshops devoted to the problem of pluvial flooding and its management, carried out in an attempt to properly deal with this hazard.

Keywords: urban areas; pluvial flooding; climate change; precipitation; scientific projects; water management; Europe.

Floods are the most prevalent natural hazard in Europe. Between 1998 and 2009, Europe suffered over 213 major damaging floods (Bakker et al., 2013). Coastal and river floods receive the most attention as they are generally the floods that are largest and last the longest, while pluvial floods are relatively underrepresented in research (Nicklin et al., 2019), most likely due to the smaller scale of individual events (Dawson et al., 2008). The absolute record of annual flood loss for all types of floods in Europe was observed in August 2002, when the material damage exceeded €20 billion in nominal value (Kundzewicz et al., 2012). However, there is an increasing problem of massive and intensifying flood damage in areas away from rivers. For example, in Great Britain two flood events in summer 2007 cost nearly €6 billion (Falkenhagen, 2010). Recent research has suggested that due to the frequent nature of pluvial floods, cumulative direct damage to property caused by those types of floods equals or may even exceed damage from river and coastal floods (Nicklin et al., 2019). Pluvial floods produce less damage per event, but their frequency is higher and the cumulative damage over the years can be just as high as from fluvial flooding events (Acosta-Coll et al., 2018) or even higher (Szewrański et al., 2018a). For instance, of the 11 000 properties flooded in autumn of 2000 in the UK, 83% were outside coastal and fluvial floodplains, suggesting that flooding was caused by local pluvial events, sewer flooding or groundwater (Dawson et al., 2008).
Pluvial flooding can be defined as flooding that results from overland flow and ponding before the runoff enters any watercourse, drainage system or sewer, or cannot enter it because the network is full to capacity; it is usually caused by intense localized rainfall. This problem is enhanced in cities with insufficient or non-existent sewer systems (Acosta-Coll et al., 2018). Also, Falconer et al. (2009) state that it is important not to confuse 'pluvial flooding' with 'surface water flooding'. According to them, surface water flooding usually refers to combined flooding in urban areas during heavy rainfall. As such, it includes pluvial flooding, sewer flooding, flooding from small open channels and overland flows from groundwater springs. Pluvial flooding is also different from 'flash flooding', which may also be associated with high-intensity rainfall but usually arises from a watercourse. Further in the text of this paper, the terms "urban pluvial flooding", "inland pluvial flooding", "pluvial flooding", "intra-urban system flooding", "urban drainage flooding" and "surface water flooding" will be used interchangeably.

Pluvial flooding only occurs when the rainfall rate exceeds both the capacity of storm water drains to evacuate the water and the capacity of the ground to absorb it. This is usually associated with short-duration storms (of up to three hours) and with rainfall that exceeds 20–25 mm per hour, but it can also occur after rainfall of smaller intensity, approximately 10 mm per hour, that falls over longer periods, especially if the ground surface is impermeable because it is developed, saturated or frozen (Houston et al., 2011). However, pluvial floods depend not only on the amount and duration of precipitation but also on the hydrological characteristics of the basin, such as runoff magnitude, antecedent moisture condition, drainage area and soil type (Acosta-Coll et al., 2018). In addition, land use change, particularly urbanization, is changing the proportion of precipitation that becomes runoff and reducing the delay between precipitation and the runoff reaching a watercourse (Green et al., 2013). According to Li (2012), urban storm water logging problems result from various causes, such as the uneven distribution of precipitation in time and space, inadequate urban water-logging emergency response systems, decreasing green areas and filling of waterbodies because of urbanization, and insufficient capacity in storm water drainage systems without proper maintenance and upgrading. Other reasons for frequent inundation include outdated sewer-stormwater systems, greater areas of impervious urban fabric and larger urban populations (Sušnik et al., 2014). Increasing urbanization often results in an expansion of impermeable areas, whereby the higher proportion of sealed soils results in an increased runoff volume and a decreased response time of a catchment, while further risk comes from urban areas expanding into flood-risk areas (Swart et al., 2012).

The main goals of this research are twofold: a) first, to show the connection between current global climate change processes and urban pluvial flooding, and to present effects of flooding in European cities, as well as what we can expect in the future; b) secondly, to present strategies, directives and measures devised both on national and international levels, as well as scientific projects that deal with urban pluvial flood issues, in order to contribute to better mitigation and adaptation actions in European cities.
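The intensity/duration thresholds quoted above from Houston et al. (2011) can be read as a simple screening rule. A minimal sketch, assuming one representative intensity per event and collapsing "developed, saturated or frozen" ground into a single impermeability flag (both simplifications are ours, not the source's):

```python
def pluvial_flood_potential(intensity_mm_per_h: float,
                            duration_h: float,
                            ground_impermeable: bool = False) -> bool:
    """Screening rule based on the indicative thresholds in
    Houston et al. (2011):
    - short, intense storms: > 20-25 mm/h for up to ~3 hours;
    - longer events of ~10 mm/h if the surface is developed,
      saturated or frozen (treated here as 'impermeable')."""
    if duration_h <= 3 and intensity_mm_per_h > 20:
        return True
    if ground_impermeable and intensity_mm_per_h >= 10:
        return True
    return False

# Example: a 2-hour cloudburst averaging 75 mm/h clearly qualifies.
print(pluvial_flood_potential(75, 2))          # True
# A 12-hour drizzle at 8 mm/h on permeable ground does not.
print(pluvial_flood_potential(8, 12, False))   # False
```

This is only a first-pass screen; as the paragraph above notes, actual flooding also depends on antecedent moisture, drainage area, soil type and drainage capacity.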
For our analysis we used the scientific literature from the last 10 to 12 years, as well as official documents from international institutions (such as the UN and EU) and national governments.

The occurrence of pluvial or flash floods due to high-intensity rainfall events is nothing new. However, it appears that the frequency with which they are happening and their impact on human lives, damage and disruption are increasing, very likely because of climate change, and this is unfortunately predicted to increase further (Falconer et al., 2009). As presented by the IPCC Fifth Assessment Report (2014) on the worldwide impacts of climate change on rainfall extremes and urban drainage, it was observed that typical increases in rainfall intensity at small urban hydrology scales range from 10% to 60% from control periods in the recent past (typically 1961–1990) up to 2100 (Figure 1). These changes in extreme short-duration rainfall events may have significant impacts on urban drainage systems and pluvial flooding. The Danish Meteorological Institute (DMI) predicts that the intensity of heavy downpours will rise by 20-50% by 2100, most of all for the very rare events, which will have great implications for how rain runs off surfaces and for the burden on sewer systems and watercourses (Copenhagen Climate Adaptation Plan, 2009). Climate change is expected to increase the frequency and intensity of rainfall events throughout Europe (Sušnik et al., 2014), especially in the central and northern parts ("STAR-FLOOD"; https://www.starflood.eu/). Flood hazard may also rise during wetter and warmer winters, with increasingly more frequent rain and less frequent snow ("STAR-FLOOD"; https://www.starflood.eu/), while a warmer atmosphere will hold a higher amount of water vapor (Kundzewicz, 2015). According to the Fifth IPCC Assessment Report (2014), there will be a marked increase in extremes in Europe, in particular in heat waves, droughts, and heavy precipitation events. Changes in extreme precipitation depend on the region, with a high probability of increased extreme precipitation in Northern Europe (all seasons) and Continental Europe (except summer). This may result in more frequent and more intense floods of various types, such as local, sudden floods (flash floods); extensive, longer-lasting pluvial and fluvial floods; coastal floods; and snowmelt floods (Menne & Murray, 2013). With the expected changes, drainage systems built today probably won't be able to meet the desired service levels in the future (Zhou et al., 2012).

On the other hand, some authors state that climate change impacts on future extreme precipitation, and consequently on pluvial flooding, are surrounded by large uncertainties. One of the uncertainties lies in the incomplete understanding of processes and components in the Earth's system, resulting in large model uncertainties and thus large variations in projected changes of future precipitation extremes between different models (Kaspersen et al., 2017). In addition, climate models provide an assessment of only anthropogenic impacts and usually don't account for natural changes that will occur at the same time, while questions arise about the assumptions behind the climate models and how these assumptions influence the projections (Arnbjerg-Nielsen et al., 2013). However, the uncertainties associated with climate change should not be an argument for delaying investigation of its possible impact on pluvial flooding or postponing adaptation actions.
Instead, uncertainties should be accounted for while flexible and sustainable solutions are sought, some of which will be presented in the following sections.

Current risks from pluvial flooding and future projections

Risks and adverse effects posed by pluvial flooding are numerous (Figure 2). The direct and indirect impacts of extreme weather include losses in economic terms, damage to and destruction of private buildings and urban infrastructure, the loss of human lives, the degradation of safety and the deterioration of water quality (Szewrański et al., 2018a). In addition, flooding, especially as a result of intense precipitation, is the predominant cause of weather-related disruption to the transport sector (Pregnolato et al., 2017) and of traffic delay and inconvenience (Zhou et al., 2012). Examples of indirect effects are also lost working hours and health impacts on affected residents, which can manifest if sewer water flows onto streets or if pluvial flood water stands stagnant (Sušnik et al., 2014). Furthermore, indirect impacts may occur beyond the location and time of a flood event, such as long-lasting trauma and stress (Szewrański et al., 2018b). On the other hand, average mortality for drainage floods alone is low: more than half of the drainage events in the dataset cause one or zero fatalities (Jonkman & Vrijling, 2008).

According to the European Environment Agency (2012), several factors tend to increase the risk of pluvial flooding:
• Old drainage infrastructure that often does not keep pace with ongoing urbanization.
• Combined sewer systems in older areas (rainfall drains into sewers that also carry sewage, and both are transferred to sewage treatment), which are more vulnerable to excessive rainfall than separate systems.
• Inadequate maintenance of drainage channels to monitor debris and solid waste within such systems.
• Inadequate discharge of excess water to the regional water system.

Douglas et al. (2010) analyzed potential weak points in the risk management of serious pluvial flooding in a case study of flooding in Heywood, Greater Manchester in 2004 and 2006. It was revealed that all agencies involved in flood risk management, and in particular planners, require more robust and more localized data. This study also highlighted that the general public are confused about who does what and who is responsible for pluvial flood risk management, and are not well informed about how best to protect their properties. Also, many agencies underestimate the ongoing health and social effects of flooding.

Modeling studies show that urbanization and increasing rainfall intensity will increase drainage overflow volumes, resulting in more frequent and severe pluvial flooding (Miller & Hutchins, 2017). At present about 55% of the global population live in cities, and by 2050 almost two thirds of the world's population will live in urban environments (Sörensen et al., 2016). Over 80% of the population in Britain lives in urban areas, while it is predicted that population growth will reach 74.3 million by 2039 (Miller & Hutchins, 2017). A new study shows that the total urban area exposed to flooding in Europe has increased by 1000% over the past 150 years (Jongman, 2018).
This means that urbanization, with its increase in non-permeable surfaces and lack of natural drainage, created additional flooding issues that did not previously exist, and that never before have so many human assets been in the way of floods as today. According to Kazmierczak & Cavan (2011), the negative correlations between green space cover and the proportion of an area susceptible to flooding suggest that the increasing amount of sealed surfaces in an area aggravates the problem of flooding through increased runoff and reduced infiltration capacity. Furthermore, Guerreiro et al. (2017) developed a map of Europe representing the percentage of each city flooded for historical hourly rainfall with a 10-year return period (Figure 3). The growing urban population and degree of urbanization put great pressure on existing drainage systems, increasing the likelihood of them being overwhelmed ("Urban pluvial flooding and climate change: London (UK), Rafina (Greece) and Coimbra (Portugal)"; https://www.imperial.ac.uk/grantham/research/resources-and-pollution/water-securityand-flood-risk/urban-flooding/). Systems currently designed for a 20-year return period of flooding might flood with a mean recurrence interval of 5 years by the end of the century ("Flash floods and Urban flooding"; https://www.climatechangepost.com). On 7th August 2002, an inch of rain fell in central London in 30 minutes during the evening rush hour, resulting in the closure of 5 mainline railway stations and considerable disruption, as London's drainage infrastructure was too old and overloaded to cope with such events (Crichton, 2005). According to UK statistics ("Facts About Floods in the UK"; https://rainbow-int-franchise.co.uk/flooding-statistics-uk/), the residents of around 2.4 million UK properties are at risk from fluvial and coastal flooding each year, while a further 2.8 million are susceptible to surface water – or pluvial – flooding. Kaspersen et al. (2017) found that urban development in Odense and Vienna influences the extent of flooding considerably, while only marginally affecting the degree of flooding for Strasbourg and Nice. This suggests that further soil sealing in Odense and Vienna (and similar urban areas) should be considered very carefully, as it may substantially increase their exposure to pluvial flooding; the effect of urban development on pluvial flooding varies locally and should be assessed with that in mind.

The financial implications of pluvial flooding can be significant. It is estimated that in the Netherlands, between 1986 and 2009, the total damage from pluvial floods was €674 million (Sušnik et al., 2014). Nicklin et al. (2019) used a combination of 3D flood modelling and the WSS (Dutch 'Waterschadeschatter') flood damage estimation tool to assess direct flood damage from a 60 mm/1-h pluvial flood event in two urban areas: Belgrave (Leicester, United Kingdom) and Lombardijen (Rotterdam, the Netherlands). For Belgrave, direct damage was estimated at roughly €11 million, while for Lombardijen direct damage was €12.4 million. In England and Wales during the summer of 2007, about 48 000 households and nearly 7300 businesses were flooded (Menne & Murray, 2013), while insurance claims from the homes and businesses affected approached £3 billion and other costs amounted to around £1 billion (Environment Agency, 2007). According to Bernet et al.
(2017), of all damage due to surface water floods and fluvial floods in Switzerland between 1999 and 2013, surface water floods were responsible for at least 45% of the flood damage to buildings and 23% of the associated direct tangible losses. Houston et al. (2011) estimated that almost 2 million people in urban areas in the UK face an annual 0.5 per cent probability ('1 in 200-year') of pluvial flooding. Most of the areas with a lower percentage of city flooded are in the north and west coastal parts of Europe, while the higher percentages are predominantly in continental and Mediterranean areas (Guerreiro et al., 2017). In the Mediterranean region, major population and economic growth has taken place along the coast in the past century, which has led to the extension of urban settlements into flood-prone areas (Gaume et al., 2016). Lugeri et al. (2006) analyzed flood risk exposure in 13 European countries and found that Slovenia has the highest share of urban fabric built in flood-prone areas – more than 70%. An estimated 3.8 million properties are thought to be at risk from pluvial flooding in England, which represents around 10% of all properties, while in Scotland some 15 000 properties have been estimated to be at pluvial flood risk (Houston et al., 2011). The expected annual damages from urban flooding in the UK are estimated at £0.27 billion, compared with £0.6-2.1 billion for fluvial and coastal flooding; the estimate for the future is that this could be as much as £2-15 billion by 2080, compared with £1.5-20 billion for fluvial and coastal flooding (Dawson et al., 2008). Furthermore, Evans et al. (2008) in the Pitt Review estimated that the future risk from intra-urban system flooding might rise by the 2080s to be of the same order as fluvial and coastal flood risk. Menne & Murray (2013) studied floods in the European region and their health effects and found that in the period between 2005 and 2010, 16 countries were affected by pluvial floods: Bosnia and Herzegovina, Croatia, Czech Republic, Hungary, Malta, Poland, Republic of Moldova, Serbia, Slovenia, Spain, Sweden, Tajikistan, Republic of North Macedonia, Turkey, Ukraine, and the United Kingdom (England and Wales). As mentioned previously, with the projected continuing increase in the contribution of heavy rain to total precipitation (Santato et al., 2013) and with current urbanization and population growth, it is estimated that by 2050, 3.2 million people in urban areas in the UK could be at risk from pluvial flooding, an increase of 1.2 million (Houston et al., 2011).

Figure 4, developed by the European Environment Agency (2012), shows the projected change in the annual number of days with heavy rainfall in 2071–2100 against the reference period (1961–1990). Projections for regions south of the Alps show a decline in the number of days with extreme precipitation of up to five days and more. Most regions north of the Alps expect an increase, mostly of one to three days. In addition, this map shows the degree of mean soil sealing in the urbanized areas of cities. Cities with high soil sealing and an increasing number of intensive rainfall events, in particular in north-western and northern Europe, face a higher risk of urban drainage flooding. Nevertheless, cities in areas with a decreasing number of such events but high soil sealing still face a flooding risk, just less often.
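Return periods like the '1 in 200-year' figure above map directly onto annual exceedance probabilities, and compounding that probability over a planning horizon is a one-line calculation. A minimal sketch, assuming statistical independence between years:

```python
def annual_exceedance_probability(return_period_years: float) -> float:
    """AEP for a T-year event: p = 1 / T."""
    return 1.0 / return_period_years

def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    """Chance of at least one T-year event in n years, assuming
    independent years: 1 - (1 - 1/T)**n."""
    p = annual_exceedance_probability(return_period_years)
    return 1.0 - (1.0 - p) ** horizon_years

# The '1 in 200-year' pluvial risk quoted from Houston et al. (2011):
print(annual_exceedance_probability(200))    # 0.005, i.e. 0.5% per year
# Over a 30-year horizon that still compounds to roughly a 14% chance:
print(round(prob_at_least_one(200, 30), 3))  # ~0.14
```

The same arithmetic underlies the earlier claim that a drainage system designed for a 20-year return period effectively becomes a 5-year system once event frequencies rise.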
Cities of high and low soil sealing can be found in all regions and do not cluster in a particular region, with the exception of low sealing levels in cities in Finland, Norway, Slovenia and Sweden. Cyprus, Estonia, Greece and Luxembourg are countries with a high share of cities with elevated levels of soil sealing.

Examples of pluvial flooding events across the European continent

Gaume et al. (2009) compiled a comprehensive data record of flash floods for seven European hydrometeorological regions. This inventory was the first step towards an atlas of extreme flash floods in Europe, and the objective was to document a minimum of 30 floods in each region, especially the events considered the most extreme, or ''top 30'' flash floods, homogeneously distributed over the selected period. However, this research did not include pluvial floods, and no similar analysis focusing on pluvial flooding events in Europe could be found. Therefore, this section provides a few examples of pluvial flooding occurrences that had significant economic and social impacts on the European communities affected by this hazard.

In the summer of 2007, floods that struck much of the United Kingdom during June and July affected hundreds of thousands of people. This event was the most serious inland pluvial and fluvial flood ever recorded, with 13 deaths, about 7000 people rescued from floodwaters by the emergency services, and about 48 000 households and nearly 7300 businesses flooded (Menne & Murray, 2013), while insurance claims from the homes and businesses flooded approached £3 billion (Environment Agency, 2007). The floods caused the loss of essential services: almost half a million people were without water or electricity supply, transport networks failed, a dam breach was narrowly averted, and emergency facilities and telecommunications were put out of action (Menne & Murray, 2013). During June, July, and August of 2007 a succession of depressions tracked over the UK, bringing heavy rainfall and triggering multiple flooding events (Stuart-Menteth, 2007). With 414 mm of rain, England and Wales had not seen a wetter May to July since records began in 1766 (Environment Agency, 2007). On 12th June 2007, a total of 98.3 mm of rain fell in one hour in East and South Belfast, resulting in both fluvial flash flooding and pluvial flooding that caused major disruption throughout Belfast, with over 400 properties affected (Falconer, 2009). Two particularly large floods hit within just four weeks of each other. First, the northeast of England was badly affected following heavy rainfall on June 25th, which caused flooding in cities and towns such as Sheffield, Doncaster, Rotherham, Louth, and Kingston-upon-Hull (Figure 5). Some areas were hit again by further flooding after more severe rains on July 20th, which affected a much larger area of central England, including Oxford, Gloucester, Tewkesbury, Evesham, and Abingdon (Stuart-Menteth, 2007).
According to the emergency services, that summer saw the greatest number of search and rescue missions in the country since the Second World War (Environment Agency, 2007). Just a couple of years before this event, at the end of July 2002, another extreme series of storms affected much of the UK, especially west and central Scotland, producing extreme amounts of rainfall at several locations in localized, intense heavy downpours and generating surface water flooding and pluvial flooding that affected small urban watercourses, drainage systems and sewers (Falconer, 2009). The full storm began at approximately 10:30 am on 30 July 2002 and continued for a total of approximately 10 hours; it measured 75 mm depth and had a maximum intensity of 94.5 mm/h, which can be linked to a maximum return period of 100 years (Wilson & Spiers, 2003).

According to the European Environment Agency (2012), on July 2nd 2011, Copenhagen in Denmark was hit by a huge thunderstorm after a substantially hot period. During a two-hour period, over 150 mm of rain fell in the city centre. This was the biggest single rainfall in Copenhagen since measurements began in the mid-1800s. The city's sewers were unable to handle all of the water, and as a result many streets were flooded and sewers overflowed into houses, basements and onto streets, thereby flooding the city (Figure 6). Insurance damages alone were estimated at €650–700 million. Damage to municipal infrastructure not covered by insurance, such as roads, amounted to €65 million.

The Marmara region in north-western Turkey suffered a series of floods from 7th to 10th September 2009, with 35 000 people affected, 32 lives lost and more than $100 million in economic damage. The 24-hour rainfall amounts varied between 100 and 253 mm during the flooding period, and additional factors such as land use changes, urbanization, poor drainage, and construction and settling in flood-prone areas worsened the consequences of the floods, especially in the major urban areas of the region. Istanbul suffered most: some suburban districts were submerged, the city's highways were turned into rivers, and transportation and communication infrastructure was damaged (Kömüşcü & Çelik, 2012).

On 18th September 2007 an extreme rainfall event affected approximately one-third of Slovenia, causing €200 million of damage and six casualties (Rusjan et al., 2009). In the town of Železniki, the observed maximum daily rainfall was nearly 200 mm, the highest recorded amount of precipitation since measurements began in 1930. The flood devastated the town: three people lost their lives, and it was estimated to have caused nearly €100 million of damage (Markošek, 2008).

In June 2010, storms hit the south-east of France, and the large amounts of heavy rain led to localized flash flooding and pluvial flooding that caused severe damage and loss of life in southern France; a number of towns in the department of Var were affected, with hundreds of homes flooded (Moreau & Roumagnac, 2010). Torrential rainfall hit southern Italy and produced major flooding in parts of Sicily and Calabria on October 4th 2018. The urban area of Catania, Sicily was hit hard, with streets turned into rivers (Figure 7).
Catania experienced intense rainfall, with about 50 mm falling in only 20 minutes as the severe thunderstorm passed ("Major flash floods hit urban areas of Catania, Sicily"; http://www.severe-weather.eu/news/major-flash-floods-hit-urban-areas-of-catania-sicily/). In May and June of 2016, Germany was struck by recurring thunderstorms, with damage across Germany totaling €2.6 billion (Faust, 2018). Parts of Germany again came to a standstill after storms and torrential rain, especially in the south, in May of 2019: one person died, daily life was disrupted, and heavy rain and thunderstorms, mainly in southern and central Germany, left rivers overflowing and streets flooded (Silk, 2019).

Pluvial flooding risk management

Adopted measures and strategies

Measures and strategies that increase the specific response capacity of cities to flooding, according to Swart et al. (2012), can be classified into structural and non-structural measures, or into grey, green and soft measures (Figure 8). The response capacity measures include spatial planning, constructional measures, risk acceptance, behavioral adaptation, information systems, technical flood protection, increasing natural water retention in catchment areas, and reducing land sealing. Structural measures decrease the risk and are mostly effective, but they usually involve management problems. Non-structural measures, on the other hand, reduce vulnerability: when they are permanent they are reliable but can be socially costly, while when they are temporary and less costly they become less reliable (Working Group F, 2010). These can be classified as passive and active, where active non-structural measures are those that promote direct interaction with people, such as training, local management, early warning systems for people, and public information, while passive measures involve policies, building codes and standards, and land use regulations (Acosta-Coll et al., 2018). Some of the adaptation measures involve the on-site improvement of retention, infiltration, evaporation, and rainwater recycling with the use of green roofs, permeable or porous pavements, rain gardening, urban rainwater harvesting, or the application of water-absorbing geocomposites (Szewrański et al., 2018b).

The problem of pluvial flooding is slowly starting to receive more attention, according to the interviews conducted by Mees et al. (2016) and the numerous research papers (Candela & Aronica, 2016; Falconer et al., 2009; Szewrański et al., 2018a/b; Fritsch et al., 2016), conferences and workshops (Third Hydrology Forum, Oslo, 2016; Flash Floods and Pluvial Flooding Workshop, Cagliari, 2010; 3rd European Conference on Flood Risk Management, Lyon, 2016) on this topic. Through further examples of different projects, strategies and initiatives implemented in European countries separately or in mutual cooperation across the continent, various methods of urban pluvial flooding management can be observed. For instance, the EU Directive on the assessment and management of flood risks (pluvial floods included), often referred to as the Floods Directive, entered into force on 26th November 2007; its main aim is to reduce and manage the risks posed by floods to human health, the environment, cultural heritage and economic activity (Bakker et al., 2013).
The Floods Directive contains a three-stage approach: first, a preliminary flood risk assessment must be undertaken; then flood hazard maps and flood risk maps are to be prepared; and in the final stage, member states must establish flood risk management plans. An analysis by Priest et al. (2016) indicates that the effect of the Floods Directive is highly variable among the six European countries they studied (Belgium, England, France, the Netherlands, Poland, and Sweden), but despite its shortcomings in directly affecting flood risk outcomes, it has had a positive influence in stimulating discussion and flood risk management planning in member states that were perhaps lagging behind. As another example, according to Land Use Consultants (2003), Sustainable Urban Drainage Systems involve moving away from conventional piped systems toward engineering solutions that mimic natural drainage processes and minimize adverse effects on the environment; these may take the form of infiltration systems, whereby water soaks away into the ground, or attenuation systems, which release flows gradually to watercourses or sewers. Separate storm water and foul water systems can increase drainage capacity and reduce the likelihood of sewage mixing with pluvial flood water (Houston et al., 2011).

As mentioned previously, different uncertainties surround risk assessments for urban flooding, particularly in connection with climate change models and small-scale projections of extreme precipitation. Kaspersen & Kirsten (2017) proposed a way to address these uncertainties by using a very detailed integrated data and modelling approach, such as the DIAS (Danish Integrated Assessment System) tool they describe in detail, which can help identify particularly vulnerable and valuable assets that climate change adaptation measures should protect. While warning about extreme weather events in Germany is done nationwide by the German Meteorological Service, flood forecasting and warning is decentralized in Germany, which poses the main challenge of handling measured data coming from various providers and monitoring networks in individual formats (Osnabrugge et al., n.d.). The German Water Association (DWA) set up different working groups with the aim of establishing technical standards and providing affected interest groups with guidelines and practical advice, which, during a heavy rainfall event in 2013 with a return period of about 100 years, proved to significantly reduce flood risk and gained public acceptance (Fritsch et al., 2016). As a further example, Hamburg has introduced a separate rainwater drainage system in recent years and introduced financial penalties if rainwater is not locally drained by home owners (Schlünzen & Bohnenstengel, 2016).

Projects related to pluvial flooding issues

The following table presents various examples of different projects, strategies and initiatives, implemented in European countries separately or in mutual cooperation across the continent, that deal with and manage pluvial flooding. Good examples of pluvial flooding risk management can be found outside of Europe as well and could perhaps be studied further in an attempt to adapt good practices from across the world. For example, China is currently in the process of implementing a policy initiative called sponge cities to holistically tackle urban pluvial flooding while promoting sustainable urban development with reduced environmental impact.
This initiative is well grounded in scientific understanding.

Conclusions

Pluvial flooding, or flooding that results from intense localized rainfall that exceeds the capacity of a drainage system, is gaining wider awareness in Europe. The estimate that cumulative damage from pluvial flooding over the years can be just as high as from fluvial flooding events, or even higher, is worrying and a cause for concern. The risks posed by pluvial flooding are numerous, from economic losses and the destruction of private buildings and urban infrastructure to the loss of human lives, the deterioration of water quality, and health impacts. With climate change projections indicating an increase in the frequency and intensity of rainfall events throughout Europe, and with ongoing urbanization and its own effects, adaptive and sustainable solutions should be explored and pursued as soon as possible. The problem of pluvial flooding is most certainly starting to receive more attention, and some countries have already developed strategies for dealing with this hazard. As this review shows, there are already numerous research papers, studies, conferences and workshops dedicated to the problem of pluvial flooding and its management. Various project strategies and initiatives that deal with pluvial flooding risk management have been implemented in European countries separately or in cooperation with one another. Some of the measures presented include spatial planning, constructional measures, risk acceptance, information systems, early warning systems for people, reducing land sealing through policies, building codes and standards, and land use regulations, as well as adaptation measures such as the on-site improvement of retention, infiltration, evaporation, and rainwater recycling with the use of green roofs, permeable or porous pavements, rain gardening or urban rainwater harvesting.
References

Acosta-Coll, M., Merelo, F., Peiro, M.M., & De la Hoz, E. (2018). Real-Time Early Warning System Design for Pluvial Flash Floods—A Review. Sensors, 18, 2255. DOI:10.3390/s18072255
Arnbjerg-Nielsen, K., Willems, P., Olsson, J., Beecham, S., Pathirana, A., Gregersen, I., Madsen, H., & Nguyen, V.-T.V. (2013). Impacts of Climate Change on Rainfall Extremes and Urban Drainage Systems. Water Science and Technology, 68(1), 16-28. DOI:10.2166/wst.2013.251
Ashley, R., Blanksby, J., Maguire, T., & Leahy, T. (2019). Frameworks for Adapting to Flood Risk: Experiences from the EU's Flood Resilient City Project.
Bakker, M.H.N., Green, C., Driessen, P., Hegger, D., Delvaux, B., van Rijswick, M., Suykens, C., Beyers, J.C., Deketelaere, K., van Doorn-Hoekveld, W., & Dieperink, C. (2013). Flood Risk Management in Europe: European flood regulation. STAR-FLOOD Consortium, Utrecht, The Netherlands. ISBN: 978-94-91933-04-2
Bernet, D.B., Prasuhn, V., & Weingartner, R. (2017). Surface water floods in Switzerland: what insurance claim records tell us about the damage in space and time. Natural Hazards and Earth System Sciences, 17, 1659-1682. https://doi.org/10.5194/nhess-17-1659-2017
Candela, A., & Aronica, G.T. (2016, September). Derivation of Rainfall Thresholds for Pluvial Flood Risk Warning in Urbanised Areas. XXXV Convegno Nazionale di Idraulica e Costruzioni Idrauliche, 14th–16th September 2016, Bologna, Italy. https://doi.org/10.1051/e3sconf/20160718016
Crichton, D. (2005). Flood risk & insurance in England & Wales: are there lessons to be learned from Scotland? Technical Papers 1, Benfield Greig Hazard Research Centre.
Dawson, R.J., Speight, L., Hall, J.W., Djordjevic, S., Savić, D., & Leandro, J. (2008). Attribution of flood risk in urban areas. Journal of Hydroinformatics, 10(4), 275-288. DOI:10.2166/hydro.2008.054
Douglas, I., Garvin, S., Lawson, N., Richards, J., Tippett, J., & White, I. (2010). Urban pluvial flooding: a qualitative case study of cause, effect and nonstructural mitigation. Journal of Flood Risk Management, 3, 112-125. DOI:10.1111/j.1753-318X.2010.01061.x
Environment Agency (2007). Review of 2007 summer floods. Environment Agency, Bristol.
European Environment Agency (2012). Urban adaptation to climate change in Europe. Challenges and opportunities for cities together with supportive national and European policies. EEA Report No 2/2012. Copenhagen, Denmark.
Evans, E.P., Simm, J.D., Thorne, C.R., Arnell, N.W., Ashley, R.M., Hess, T.M., Lane, S.N., Morris, J., Nicholls, R.J., Penning-Rowsell, E.C., Reynard, N.S., Saul, A.J., Tapsell, S.M., Watkinson, A.R., & Wheater, H.S. (2008). An update of the Foresight Future Flooding 2004 qualitative risk analysis. Cabinet Office, London.
Falconer, R. (2009, November). Pluvial Flooding and Surface Water Management. European Water Management and Implementation of the Floods Directive, 5th EWA Brussels Conference, 6th November 2009, Brussels, Belgium.
Falconer, R., Cobby, D., Smyth, P., Astle, G., Dent, J., & Golding, B. (2009). Pluvial flooding: New approaches in flood warning, mapping and risk management. Journal of Flood Risk Management, 2, 198-208. DOI:10.1111/j.1753-318X.2009.01034.x
Falkenhagen, B. (2010, May). Flash flood and pluvial flooding from the point of view of the insurance industry. European Commission – WFD Common Implementation Strategy, WG F Thematic Workshop on Implementation of the Floods Directive 2007/60/EC, "Flash Floods and Pluvial Flooding", 26th–28th May 2010, Cagliari, Italy.
Faust, E. (2018). Parts of Germany under water. Available at: https://www.munichre.com/topics-online/en/climate-change-and-natural-disasters/naturaldisasters/floods/floods-in-germany-2018.html
Fritsch, K., Assmann, A., & Tyrna, B. (2016, October). Long-term experiences with pluvial flood risk management. 3rd European Conference on Flood Risk Management, E3S Web of Conferences, 7, 04017. 17th–21st October 2016, Lyon, France. DOI:10.1051/e3sconf/20160704017
Gaume, E., Bain, V., Bernardara, P., Newinger, O., Barbuc, M., Bateman, A., Blaškovičová, L., Blöschl, G., Borga, M., Dumitrescu, A., Daliakopoulos, I., Garcia, J., Irimescu, A., Kohnová, S., Koutroulis, A., Marchi, L., Matreata, S., Medina, V., Preciso, E., & Viglione, A. (2009). A Collation of Data on European Flash Floods. Journal of Hydrology, 367, 70-78. DOI:10.1016/j.jhydrol.2008.12.028
Gaume, E., Borga, M., Llassat, M.C., Maouche, S., Lang, M., & Diakakis, M. (2016). Mediterranean extreme floods and flash floods. In The Mediterranean Region under Climate Change. A Scientific Update, IRD Editions, 133-144.
Green, C., Dieperink, C., Ek, K., Hegger, D.L.T., Pettersson, M., Priest, S., & Tapsell, S. (2013). Flood Risk Management in Europe: the flood problem and interventions (report no D1.1.1), STAR-FLOOD.
Mostafa Sameer added a reply:
Flood forecasting in Europe is a scientifically strong but evolving field, given the increasing focus on pluvial flood issues. Relying on real-time monitoring (river gauges, rainfall radars, satellite data), hydrological models (simulating river responses to rainfall and snowmelt), meteorological forecasts (numerical weather prediction models from ECMWF, DWD, and Météo-France), and early warning systems like the European Flood Awareness System (EFAS), European flood forecasting systems for rivers combine meteorology, hydrology, and sophisticated computational modeling. These technologies allow authorities to plan for possible floods days in advance by providing probabilistic projections. Uncertainties in rainfall predictions (where small errors may greatly affect flood volume and timing), as well as complicated interactions among urbanization, soil saturation, and abrupt snowmelt, make absolute precision still elusive. Pluvial flooding, which results from short-duration, high-intensity rain overwhelming drainage systems before the runoff reaches rivers, is especially hard to forecast. Still, forecasting accuracy is being continuously improved by advances in artificial intelligence, high-resolution modelling, and better data assimilation, including satellite observations and IoT sensors. Driven by climate change-induced extreme rainfall, urbanization (which increases impermeable surfaces), and aging drainage infrastructure, pluvial flooding presents a growing danger. Europe is implementing structural and non-structural policies to reduce these risks, including green infrastructure (permeable pavements, rain gardens, green roofs), better urban planning (limiting development in flood-prone areas), real-time flash flood alerts, and regulatory frameworks like the EU Floods Directive (2007/60/EC), which requires flood risk assessments and management plans. Although no system can forecast floods with 100% precision, continuous developments in adaptive infrastructure, monitoring, and modeling are enhancing Europe's resilience to both fluvial and pluvial flood threats.
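The probabilistic step mentioned above can be illustrated with a toy calculation. The sketch below is a simplified stand-in, not the actual EFAS algorithm: it takes a synthetic rainfall ensemble and a made-up drainage-capacity trigger of 40 mm, and reports the fraction of members exceeding the trigger.

```python
import numpy as np

def exceedance_probability(ensemble_rain_mm, threshold_mm: float) -> float:
    """Fraction of ensemble members exceeding a trigger threshold --
    the basic way an ensemble forecast is turned into an alert
    probability (greatly simplified here)."""
    members = np.asarray(ensemble_rain_mm, dtype=float)
    return float((members > threshold_mm).mean())

# Synthetic 51-member ensemble of 24-h rainfall totals (mm);
# the gamma parameters and the 40 mm trigger are invented for
# illustration only.
rng = np.random.default_rng(seed=1)
ensemble = rng.gamma(shape=2.0, scale=15.0, size=51)
p = exceedance_probability(ensemble, threshold_mm=40.0)
print(f"P(rain > 40 mm in 24 h) = {p:.0%}")
```

Presenting the result as a probability, rather than a single deterministic total, is what lets authorities weigh the cost of acting early against the cost of a missed event.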
Relevant answer
Answer
Yes, flood forecasting for the rivers of the European continent is a scientific operation. It relies on the integration of various scientific disciplines and technologies to predict the occurrence, magnitude, timing, and duration of flood events.
Here's a breakdown of why it's scientific:
  • Data Collection: A vast amount of data is collected using scientific instruments and methods. This includes:
    - Meteorological data: rainfall measurements (from rain gauges and radar), temperature, snowpack levels, and weather forecasts (from numerical weather prediction models).
    - Hydrological data: river water levels and discharge (flow rates) from stream gauges, soil moisture content, and groundwater levels.
    - Satellite data: information on precipitation, land cover, and snow extent, especially in remote areas.
  • Hydrological Modeling: Scientists develop and use complex computer models that simulate the movement and storage of water through the hydrological cycle. These models incorporate the collected data and physical laws to predict river flow and water levels. Different types of models exist (see the sketch after this list), including:
    - Physically-based models: attempt to represent the actual physical processes occurring in a river basin.
    - Data-driven models: use statistical relationships and machine learning algorithms to identify patterns in historical data and predict future floods.
    - Ensemble forecasting: running multiple model simulations with slightly different initial conditions or model parameters to account for uncertainty in the predictions.
  • Integration of Forecasts: Flood forecasting systems often combine meteorological forecasts with hydrological models to extend the lead time of predictions. This allows for earlier warnings and better preparedness.
  • Validation and Improvement: The accuracy of flood forecasts is constantly evaluated by comparing predictions with actual flood events. This allows scientists to identify model limitations and improve forecasting techniques over time.
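To make the modeling point concrete, here is a minimal, hypothetical sketch of the physically-based idea: a single linear reservoir, where outflow is proportional to storage. The recession constant k and the input hyetograph are invented for illustration; operational models chain many such stores with snow, soil-moisture and channel-routing components.

```python
import numpy as np

def linear_reservoir_runoff(rain_mm: np.ndarray,
                            k_hours: float = 12.0,
                            dt_hours: float = 1.0) -> np.ndarray:
    """Toy conceptual rainfall-runoff model: one linear reservoir
    where outflow Q = S / k (storage S in mm, Q in mm per step)."""
    storage = 0.0
    q = np.zeros_like(rain_mm, dtype=float)
    for t, r in enumerate(rain_mm):
        storage += r                          # rainfall adds to storage
        q[t] = storage * dt_hours / k_hours   # outflow this time step
        storage -= q[t]
    return q

# Hypothetical 24-hour hyetograph: a 3-hour burst of 25 mm/h.
rain = np.zeros(24)
rain[6:9] = 25.0
flow = linear_reservoir_runoff(rain)
print(f"peak runoff {flow.max():.1f} mm/h at hour {flow.argmax()}")
```

Even this toy model reproduces the two behaviors forecasters care about: the runoff peak lags the rainfall peak, and the recession after the storm is gradual rather than instantaneous.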
Can a human mechanism be created to accurately predict floods?
While significant advancements have been made in flood forecasting, achieving perfect accuracy remains a challenge. Here's why:
  • Complexity of Natural Systems: River basins are complex systems influenced by numerous interacting factors. Predicting the precise timing and amount of rainfall, snowmelt, and their subsequent impact on river flow is inherently difficult due to the chaotic nature of weather and hydrological processes.
  • Uncertainty in Weather Forecasts: Hydrological models heavily rely on weather forecasts as input. Inaccuracies in weather predictions directly translate to uncertainties in flood forecasts. The predictability of weather decreases further into the future.
  • Data Limitations: The availability and quality of real-time hydrological and meteorological data can vary across regions. Gaps or inaccuracies in data can limit the performance of forecasting models.
  • Human Factors: Human activities, such as urbanization, deforestation, and the construction of dams and levees, can significantly alter the natural flow of water and make flood prediction more complex.
  • Flash Floods: Predicting flash floods, which occur rapidly and with little warning in small catchments, is particularly challenging due to their localized nature and short timescales.
However, significant progress has been made in creating sophisticated "human mechanisms" (i.e., technological and scientific systems) for flood forecasting. These systems, like the European Flood Awareness System (EFAS), integrate data, models, and expert knowledge to provide increasingly accurate and timely flood information.
The focus is not on achieving 100% accuracy but on:
  • Improving the lead time of forecasts: Providing more time for preparedness.
  • Increasing the reliability and accuracy of predictions: Reducing false alarms and missed events (see the verification sketch after this list).
  • Quantifying uncertainty: Communicating the range of possible flood scenarios.
  • Developing more sophisticated models: Incorporating advanced techniques like machine learning and high-resolution data.
  • Enhancing communication and dissemination of warnings: Ensuring that at-risk populations receive timely and understandable information.
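Claims about "reducing false alarms and missed events" are usually quantified with categorical verification scores computed from a contingency table of warnings versus observed events. A small illustrative sketch; the tallies below are made up:

```python
def verification_scores(hits: int, misses: int, false_alarms: int) -> dict:
    """Standard categorical scores from a 2x2 flood-warning
    contingency table (event observed vs. event warned)."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return {"POD": round(pod, 2), "FAR": round(far, 2), "CSI": round(csi, 2)}

# Hypothetical tally for one season of warnings:
print(verification_scores(hits=18, misses=4, false_alarms=9))
# {'POD': 0.82, 'FAR': 0.33, 'CSI': 0.58}
```

Tracking these scores over time is how the "validation and improvement" step described earlier turns into measurable progress.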
In conclusion, flood forecasting is a robust scientific operation that plays a crucial role in mitigating the impacts of floods. While perfect accuracy is unattainable due to the inherent complexity of natural systems, continuous scientific advancements are leading to increasingly reliable and useful flood prediction capabilities.
  • asked a question related to Robustness
Question
1 answer
Is flood forecasting for the rivers of the European continent a scientific operation? Can a human mechanism be created to accurately predict floods in the European continent?
Pluvial flooding is a result of overland flow and ponding before the runoff enters any watercourse, drainage system or sewer, or cannot enter it because the network is full to capacity, usually caused by intense rainfall. River and coastal floods get the most attention since they are largest and last the longest, while pluvial floods are relatively marginalized in research. Therefore, the main goal of this research was to show risk posed by pluvial floods, their connection to current global climate change processes, present effects of flooding in European cities, as well as what we can expect in the future. Furthermore, the aims were to present and get more familiar with scientific projects, strategies, directives and measures devised both on national and international levels, that deal with urban pluvial flood issues across the European continent. Climate change projections indicate that there will be an increase in the frequency and intensity of rainfall events throughout Europe and along with ongoing urbanization, the problem of pluvial flooding will most certainly require more attention, which it is starting to receive. Some countries have already developed their strategies and initiatives and implemented both structural and non-structural measures, such as spatial planning, constructional measures, information systems, reducing land sealing through policies, building codes and standards, on-site improvement of retention, infiltration, evaporation, and rainfall water recycling with the use of green roofs, permeable or porous pavements, rain gardening or urban rainwater harvesting. At the same time, there are numerous research papers, studies, conferences and workshops devoted to the problem of pluvial flooding and its management carried out in an attempt to properly deal with this hazard. Keywords: urban areas; pluvial flooding; climate change; precipitation; scientific projects; water management; Europe. Floods are the most prevalent natural hazard in Europe. Between 1998 and 2009, Europe suffered over 213 major damaging floods (Bakker et al., 2013). Coastal and river floods receive the most attention as they are generally the floods that are largest and last the longest, while pluvial floods are relatively underrepresented in research (Nicklin et al., 2019), most likely due to the smaller scale of individual events (Dawson et al., 2008). The absolute record of annual flood loss of all types of floods in Europe was observed in August 2002, when the material damage exceeded €20 billion, in nominal value (Kundzewicz et al., 2012). However, there is an increasing problem of massive and intensifying flood damages in areas away from rivers. For example, in Great Britain two flood events in summer 2007 cost nearly €6 billion (Falkenhagen, 2010). Recent research has suggested that due to the frequent nature of pluvial floods, cumulative direct damage to property caused by those type of floods equals or may even exceed damage from river and coastal floods (Nicklin et al., 2019). Pluvial floods produce less damage but the frequency is higher and the cumulative damage over the years can be just as high as from fluvial flooding events (Acosta-Coll et al., 2018) or even higher (Szewrański et al., 2018a). For instance, of the 11 000 properties flooded in autumn of 2000 in the UK, 83% were outside coastal and fluvial floodplains, suggesting that flooding was caused by local pluvial events, sewer flooding or groundwater (Dawson et al., 2008). 
Pluvial flooding can be defined as flooding that results from overland flow and ponding before the runoff enters any watercourse, drainage system or sewer, or cannot enter it because the network is full to capacity and is usually caused by intense localized rainfall. This problem is enhanced in cities with insufficient or non-existent sewer systems (Acosta-Coll et al., 2018). Also, Falconer et al. (2009) state that it’s important not to confuse ‘pluvial flooding’ with ‘surface water flooding’. According to them, surface water flooding usually refers to combined flooding in urban areas during heavy rainfall. As such, it includes pluvial flooding, sewer flooding, flooding from small open-channels and overland flows from groundwater springs. Pluvial flooding is also different from ‘flash flooding’, which may also be associated with high-intensity rainfall but usually arises from a watercourse. Further in the text of this paper terms “urban pluvial flooding”, “inland pluvial flooding”, “pluvial flooding” ,“intra-urban system flooding”, “urban drainage flooding” and “surface water flooding” will be used interchangeably. Pluvial flooding only occurs when the rainfall rate exceeds the capacity of storm water drains to evacuate the water and the capacity of the ground to absorb water and this is usually associated with short-duration storms (of up to three hours) and with rainfalls that exceed 20 – 25 mm per hour; but it can also occur after rainfalls of smaller intensity, approximately 10 mm per hour, that happen over longer periods, especially if the ground surface is impermeable by being developed, saturated or frozen (Houston et al., 2011). However, pluvial floods depend not only on the amount and duration of precipitation but also on the hydrological characteristics of the basin, such as runoff magnitude, antecedent moisture condition, drainage area and soil type (Acosta-Coll et al., 2018). In addition, land use change, particularly urbanization, is also changing the proportion of precipitation which becomes runoff and also reduces the delay between precipitation and the runoff reaching a watercourse (Green et al., 2013). According to Li (2012) the urban storm water logging problems result from various causes, such as the uneven distribution of precipitation in time and space, inadequate urban water-logging emergency response systems, decreasing green areas and filling of waterbodies because of urbanization and insufficient capacity in the storm water drainage system without proper maintenance and upgrading.Other reasons for frequent inundation include outdated sewer-stormwater systems, greater areas of impervious urban fabric and larger urban population (Sušnik et al., 2014). Increasing urbanization often results in an expansion of impermeable areas, whereby the higher proportion of sealed soils result in an increased runoff volume and a decreased response time of a catchment, while further risk comes from urban areas expanding into flood risk areas (Swart et al., 2012). The main goals of this research are twofold: a) first, to show connection between current global climate change processes and urban pluvial flooding, and present effects of flooding in European cities, as well as what we can expect in the future; b) and secondly, to present strategies, directives and measures devised both on national and international levels, as well as scientific projects that deal with urban pluvial flood issues in order to contribute to better mitigation and adaptation actions in European cities. 
For our analysis we used the scientific literature in the last 10 to 12 years, as well as official documents from international institutions (such as UN, EU) or national governments.The occurance of pluvial or flash floods due to highintensity rainfall events is nothing new. However, it appears that the frequency with which they are happening, their impact on human lives, damage and disruption is increasing, very likely because of the climate change, and unfortunately it’s predicted to increase further (Falconer et al., 2009). As presented by IPCC Fifth Assessment Report (2014) on the world-wide impacts of climate change on rainfall extremes and urban drainage, it was ob-served that typical increases in rainfall intensity at small urban hydrology scales range from 10% to 60% from control periods in the recent past (typically 1961–1990) up to 2100 (Figure 1). These changes in extreme short-duration rainfall events may have significant impacts for urban drainage systems and pluvial flooding. The Danish Meteorological Institute (DMI) predicts that the intensity of the heavy downpours will rise by 20-50% by 2100, the most for the very rare events which will have great implication on how the rain will run off surfaces and on the burden on sewer systems and watercourses (Copenhagen Climate Adaptation Plan, 2009). Climate change is expected to increase the frequency and intensity of rainfall events throughout Europe (Sušnik et al., 2014), especially in the central and northern parts (“STAR-FLOOD”; https://www.starflood.eu/). Flood hazard may also rise during wetter and warmer winters, with increasingly more frequent rain and less frequent snow (“STAR-FLOOD”; https://www.starflood.eu/), while warmer atmosphere will hold higher amount of water vapor (Kundzewicz, 2015). There will be a marked increase in extremes in Europe, in particular, in heat waves, droughts, and heavy precipitation events, according to the Fifth IPCC Assessment Report (2014). Changes in extreme precipitation depend on the region, with high probability of increased extreme precipitation in Northern Europe (all seasons) and Continental Europe (except summer). This may result in more frequent and more intense floods of various types such as local, sudden floods (flash floods); extensive, longer-lasting pluvial and fluvial floods; coastal floods and snowmelt floods (Menne & Murray, 2013). With the expected changes, the drainage system built today probably won’t be able to meet the desired service levels in the future (Zhou et al., 2012).On the other hand, some authors state that climatechange impacts on future extreme precipitation, and consequently on pluvial flooding, is surrounded by large uncertainties. One of the uncertainties lies in the incomplete understanding of processes and components in the Earth’s system, resulting in large model uncertainties and thus large variations in projected change of future precipitation extremes between different models (Kaspersen et al., 2017). In addition, climate models provide an assessment of only anthropogenic impacts and usually don’t account for natural changes that will occur at the same time, while questions arise about the assumptions behind the climate models and how these assumptions influence the projections (Arnbjerg-Nielsen et al., 2013). However, the uncertainties associated with climate change should not be an argument for delaying investigating its possible impact on pluvial flooding or postponing adaptation actions. 
Instead, uncertainties should be accounted for, while flexible and sustainable solutions should be sought, some of which will be presented in the following sections.
Current risks from pluvial flooding and future projections
Risks and adverse effects posed by pluvial flooding are numerous (Figure 2). The direct and indirect impacts of extreme weather include losses in economic terms, the damaging and destruction of private buildings and urban infrastructure, the loss of human lives, the degradation of safety and the deterioration of water quality (Szewrański et al., 2018a). In addition, flooding, especially as a result of intense precipitation, is the predominant cause of weather-related disruption to the transport sector (Pregnolato et al., 2017) and of traffic delay and inconvenience (Zhou et al., 2012). Examples of indirect effects are also lost working hours and health impacts on affected residents, which can manifest if sewer water flows onto streets or if pluvial flood water stands stagnant (Sušnik et al., 2014). Furthermore, indirect impacts may occur beyond the location and time of a flood event, such as long-lasting trauma and stress (Szewrański et al., 2018b). On the other hand, average mortality for drainage floods alone is low: more than half of the drainage events in the dataset cause one or zero fatalities (Jonkman & Vrijling, 2008).
According to the European Environment Agency (2012), several factors tend to increase the risk of pluvial flooding:
• Old drainage infrastructure that does not keep pace with ongoing urbanization.
• Combined sewer systems in older areas (rainfall drains into sewers that carry sewage, and both are transferred to sewage treatment), which are more vulnerable to excessive rainfall than separate systems.
• Inadequate maintenance of drainage channels to monitor debris and solid waste within such systems.
• Inadequate discharge of excess water to the regional water system.
Douglas et al. (2010) analyzed potential weak points in the risk management of serious pluvial flooding in a case study of flooding in Heywood, Greater Manchester, in 2004 and 2006. It revealed that all agencies involved in flood risk management, and in particular planners, require more robust and more localized data. The study also highlighted that the general public are confused about who does what and who is responsible for pluvial flood risk management, and are not well informed about how best to protect their properties. In addition, many agencies underestimate the ongoing health and social effects of flooding.
Modeling studies show that urbanization and increasing rainfall intensity will increase drainage overflow volumes, resulting in more frequent and severe pluvial flooding (Miller & Hutchins, 2017). At present about 55% of the global population live in cities, and by 2050 almost two thirds of the world’s population will live in urban environments (Sörensen et al., 2016). Over 80% of the population of Britain lives in urban areas, and the population is predicted to reach 74.3 million by 2039 (Miller & Hutchins, 2017). A recent study shows that the total urban area exposed to flooding in Europe has increased by 1000% over the past 150 years (Jongman, 2018).
This means that urbanization, with its increase in non-permeable surfaces and lack of natural drainage, has created flooding issues that did not previously exist, and that never before have so many human assets stood in the way of floods. According to Kazmierczak & Cavan (2011), the negative correlations between green space cover and the proportion of an area susceptible to flooding suggest that the increasing amount of sealed surfaces in an area aggravates the problem of flooding through increased runoff and reduced infiltration capacity. Furthermore, Guerreiro et al. (2017) developed a map of Europe representing the percentage of each city flooded for historical hourly rainfall with a 10-year return period (Figure 3). The growing urban population and degree of urbanization put great pressure on existing drainage systems, increasing the likelihood of them being overwhelmed (“Urban pluvial flooding and climate change: London (UK), Rafina (Greece) and Coimbra (Portugal)”; https://www.imperial.ac.uk/grantham/research/resources-and-pollution/water-security-and-flood-risk/urban-flooding/). Systems currently designed for a 20-year return period of flooding might flood with a mean recurrence interval of 5 years by the end of the century (“Flash floods and Urban flooding”; https://www.climatechangepost.com). On 7th August 2002, an inch of rain fell in central London in 30 minutes during the evening rush hour, resulting in the closure of 5 mainline railway stations and considerable disruption, as London’s drainage infrastructure was too old and overloaded to cope with such events (Crichton, 2005). According to UK statistics (“Facts About Floods in the UK”; https://rainbow-int-franchise.co.uk/flooding-statistics-uk/), the residents of around 2.4 million UK properties are at risk from fluvial and coastal flooding each year, while a further 2.8 million are susceptible to surface water – or pluvial – flooding. Kaspersen et al. (2017) found that urban development in Odense and Vienna influences the extent of flooding considerably, while only marginally affecting the degree of flooding in Strasbourg and Nice. This suggests that further soil sealing in Odense and Vienna (and similar urban areas) should be considered very carefully, as it may substantially increase exposure to pluvial flooding, while the effect of urban development on pluvial flooding varies locally and should be assessed accordingly.
The financial implications of pluvial flooding can be significant. It is estimated that in the Netherlands, between 1986 and 2009, the total damage from pluvial floods was €674 million (Sušnik et al., 2014). Nicklin et al. (2019) used a combination of 3D flood modelling and the WSS (Dutch ‘Waterschadeschatter’) flood damage estimation tool to assess direct flood damage from a 60 mm/1-h pluvial flood event in two urban areas: Belgrave (Leicester, United Kingdom) and Lombardijen (Rotterdam, the Netherlands). For Belgrave, direct damage was estimated at roughly €11 million, while for Lombardijen direct damage was €12.4 million. In England and Wales during the summer of 2007, about 48 000 households and nearly 7300 businesses were flooded (Menne & Murray, 2013), while insurance claims from the homes and businesses affected approached £3 billion and other costs amounted to around £1 billion (Environment Agency, 2007). According to Bernet et al.
(2017), of all damage due to surface water floods and fluvial floods in Switzerland between 1999 and 2013, surface water floods are responsible for at least 45% of the flood damage to buildings and 23% of the associated direct tangible losses. Houston et al. (2011) estimated that almost 2 million people in urban areas in the UK face an annual 0.5 per cent probability (‘1 in 200-year’) of pluvial flooding. Most of the areas with a lower percentage of city flooded are in the northern and western coastal parts of Europe, while the higher percentages are predominantly in continental and Mediterranean areas (Guerreiro et al., 2017). In the Mediterranean region, major population and economic growth has taken place along the coast in the past century, which has led to the extension of urban settlements into flood-prone areas (Gaume et al., 2016). Lugeri et al. (2006) analyzed flood risk exposure in 13 European countries and found that Slovenia has the highest share of urban fabric built in flood-prone areas – more than 70%. An estimated 3.8 million properties are thought to be at risk from pluvial flooding in England, around 10% of all properties, while in Scotland some 15 000 properties have been estimated to be at pluvial flood risk (Houston et al., 2011). The expected annual damages from urban flooding in the UK are estimated at £0.27 billion, compared with £0.6–2.1 billion for fluvial and coastal flooding, and the estimate for the future is that this could reach £2–15 billion by 2080, compared with £1.5–20 billion for fluvial and coastal flooding (Dawson et al., 2008). Furthermore, Evans et al. (2008) in the Pitt Review estimated that the future risk from intra-urban system flooding might rise by the 2080s to be of the same order as fluvial and coastal flood risk. Menne & Murray (2013) studied floods in the European region and their health effects and found that in the period between 2005 and 2010, 16 countries were affected by pluvial floods: Bosnia and Herzegovina, Croatia, Czech Republic, Hungary, Malta, Poland, Republic of Moldova, Serbia, Slovenia, Spain, Sweden, Tajikistan, Republic of North Macedonia, Turkey, Ukraine and the United Kingdom (England and Wales). As mentioned above, with the projected continuous increase of the heavy rain contribution to total precipitation (Santato et al., 2013), and with current urbanization and population growth, it is estimated that by 2050, 3.2 million people in urban areas in the UK could be at risk from pluvial flooding, an increase of 1.2 million (Houston et al., 2011).
Figure 4, developed by the European Environment Agency (2012), shows the projected change in the annual number of days with heavy rainfall in 2071–2100 against the reference period (1961–1990). Projections for regions south of the Alps show a decline in the number of days with extreme precipitation of up to five days and more. Most regions north of the Alps expect an increase, mostly of one to three days. In addition, this map shows the degree of mean soil sealing in the urbanized areas of cities. Cities with high soil sealing and an increasing number of intensive rainfall events – in particular in north-western and northern Europe – face a higher risk of urban drainage flooding. Nevertheless, cities in areas with a decreasing number of such events but high soil sealing still face a flooding risk, just less often.
Cities with high and low soil sealing can be found in all regions and do not cluster in a particular region, with the exception of low sealing levels in cities in Finland, Norway, Slovenia and Sweden. Cyprus, Estonia, Greece and Luxembourg are countries with a high share of cities with elevated levels of soil sealing.
Examples of pluvial flooding events across the European continent
Gaume et al. (2009) compiled a comprehensive data record of flash floods for seven European hydrometeorological regions. This inventory was the first step towards an atlas of extreme flash floods in Europe; the objective was to document a minimum of 30 floods in each region, especially the events considered the most extreme or “top 30” flash floods, homogeneously distributed over the selected period. However, this research did not include pluvial floods, and no comparable analysis focusing on pluvial flooding events in Europe could be found. Therefore, this section provides a few examples of pluvial flooding occurrences that had significant economic and social impacts on the European communities affected by this hazard.
In the summer of 2007, floods that struck much of the United Kingdom during June and July affected hundreds of thousands of people. This event was the most serious inland pluvial and fluvial flood ever recorded, with 13 deaths, about 7000 people rescued from floodwaters by the emergency services, and about 48 000 households and nearly 7300 businesses flooded (Menne & Murray, 2013), while insurance claims from the homes and businesses flooded approached £3 billion (Environment Agency, 2007). The floods caused the loss of essential services, with almost half a million people left without water or electricity supply; transport networks failed, a dam breach was narrowly averted, and emergency facilities and telecommunications were put out of action (Menne & Murray, 2013). During June, July and August of 2007 a succession of depressions tracked over the UK, bringing heavy rainfall and triggering multiple flooding events (Stuart-Menteth, 2007). With 414 mm of rain, it was the wettest May to July in England and Wales since records began in 1766 (Environment Agency, 2007). On 12th June 2007, a total of 98.3 mm of rain fell in one hour in East and South Belfast, which resulted in both fluvial flash flooding and pluvial flooding and caused major disruption throughout Belfast, with over 400 properties affected (Falconer, 2009). Two particularly large floods hit within just four weeks of each other. First, the northeast of England was badly affected following heavy rainfall on June 25th, which caused flooding in cities and towns such as Sheffield, Doncaster, Rotherham, Louth and Kingston-upon-Hull (Figure 5). Some areas were hit again by further flooding after more severe rains on July 20th, which affected a much larger area of central England, including Oxford, Gloucester, Tewkesbury, Evesham and Abingdon (Stuart-Menteth, 2007).
According to the emergency services, that summer saw the greatest number of search and rescue missions in the country since the Second World War (Environment Agency, 2007). Just a couple of years before this event, at the end of July 2002, another series of extreme storms affected much of the UK, especially west and central Scotland, and produced extreme amounts of rainfall at several locations in localized, intense heavy downpours, generating surface water flooding and pluvial flooding affecting small urban watercourses, drainage systems and sewers (Falconer, 2009). The full storm began at approximately 10:30 am on 30 July 2002 and continued for approximately 10 hours; it measured 75 mm in depth and had a maximum intensity of 94.5 mm/h, which can be linked to a maximum return period of 100 years (Wilson & Spiers, 2003).
According to the European Environment Agency (2012), on July 2nd 2011, Copenhagen in Denmark was hit by a huge thunderstorm after a substantially hot period. During a two-hour period, over 150 mm of rain fell in the city centre. This was the biggest single rainfall in Copenhagen since measurements began in the mid-1800s. The city’s sewers were unable to handle all of the water, and as a result many streets were flooded and sewers overflowed into houses, basements and onto streets, flooding the city (Figure 6). Insurance damages alone were estimated at €650–700 million. Damage to municipal infrastructure not covered by insurance, such as roads, amounted to €65 million.
The Marmara region in north-western Turkey suffered a series of floods from 7th to 10th September 2009, with 35 000 people affected, 32 human losses and more than $100 million of economic damage. The 24-hour rainfall amounts varied between 100 and 253 mm during the flooding period, and additional factors such as land use changes, urbanization, poor drainage, and construction and settlement in flood-prone areas worsened the consequences of the floods, especially in the major urban areas of the region. Istanbul suffered most: some suburban districts were submerged, the city’s highways turned into rivers, and transportation and communication infrastructure was damaged (Kömüşcü & Çelik, 2012).
On 18th September 2007 an extreme rainfall event affected approximately one-third of Slovenia, causing €200 million of damage and six casualties (Rusjan et al., 2009). In the town of Železniki, the observed maximum daily amount of rainfall was nearly 200 mm, the highest recorded amount of precipitation since the beginning of measurements in 1930; three people lost their lives, and it was estimated that the flood caused nearly €100 million of damage as it devastated the town (Markošek, 2008). In June 2010, storms hit the south-east of France, and the large amounts of heavy rain led to localized flash flooding and pluvial flooding which caused severe damage and loss of life in southern France; a number of towns in the department of Var were affected, with hundreds of homes flooded (Moreau & Roumagnac, 2010). Torrential rainfall hit southern Italy and produced major flooding in parts of Sicily and Calabria on October 4th 2018. The urban area of Catania, Sicily was hit hard, with streets turning into rivers (Figure 7).
Catania experienced intense rainfall, with about 50 mm falling in only 20 minutes as the severe thunderstorm passed (“Major flash floods hit urban areas of Catania, Sicily”; http://www.severe-weather.eu/news/major-flash-floods-hit-urban-areas-of-catania-sicily/). In May and June of 2016, Germany was struck by recurring thunderstorms, with damage across Germany totaling €2.6 billion (Faust, 2018). Parts of Germany again came to a standstill after storms and torrential rain in May 2019, especially in the south; one person died and daily life was disrupted, as heavy rain and thunderstorms, mainly in southern and central Germany, left rivers overflowing and streets flooded (Silk, 2019).
Pluvial flooding risk management
Adopted measures and strategies
Measures and strategies that increase the specific response capacity of cities to flooding can, according to Swart et al. (2012), be classified into structural and non-structural measures, or into grey, green and soft measures (Figure 8). The response capacity measures include spatial planning, constructional measures, risk acceptance, behavioral adaptation, information systems, technical flood protection, increasing natural water retention in catchment areas, and reducing land sealing. Structural measures decrease the risk and are mostly effective, but they usually involve management problems. Non-structural measures, on the other hand, reduce vulnerability: when they are permanent they are reliable but can be socially costly, while when they are temporary and less costly they become less reliable (Working Group F, 2010). These can be classified as passive or active, where active non-structural measures are those that promote direct interaction with people, such as training, local management, early warning systems for people and public information, while passive measures involve policies, building codes and standards, and land use regulations (Acosta-Coll et al., 2018). Some of the adaptation measures involve the on-site improvement of retention, infiltration, evaporation, and rainfall water recycling with the use of green roofs, permeable or porous pavements, rain gardening, urban rainwater harvesting, or the application of water-absorbing geocomposites (Szewrański et al., 2018b).
The problem of pluvial flooding is slowly starting to receive more attention, as shown by the interviews conducted by Mees et al. (2016) and numerous research papers (Candela & Aronica, 2016; Falconer et al., 2009; Szewrański et al., 2018a/b; Fritsch et al., 2016), conferences and workshops (Third Hydrology Forum, Oslo, 2016; Flash Floods and Pluvial Flooding Workshop, Cagliari, 2010; 3rd European Conference on Flood Risk Management, Lyon, 2016) devoted to this topic. Through further examples of different projects, strategies and initiatives implemented in European countries separately or in mutual cooperation across the continent, various methods of urban pluvial flooding management can be observed. For instance, the EU Directive on the assessment and management of flood risks (pluvial floods included), often referred to as the Floods Directive, entered into force on 26th November 2007; its main aim is to reduce and manage the risks posed by floods to human health, the environment, cultural heritage and economic activity (Bakker et al., 2013).
The Floods Directive takes a three-stage approach: first, a preliminary flood risk assessment must be undertaken; then flood hazard maps and flood risk maps are to be prepared; and in the final stage, member states must establish flood risk management plans. Priest et al. (2016) conducted an analysis indicating that the effect of the Floods Directive is highly variable among the six European countries they studied (Belgium, England, France, the Netherlands, Poland and Sweden), but despite its shortcomings in directly affecting flood risk outcomes, it has had a positive influence in stimulating discussion and flood risk management planning in member states that were perhaps lagging behind. As another example, Sustainable Urban Drainage Systems, according to Land Use Consultants (2003), involve moving away from conventional piped systems toward engineering solutions that mimic natural drainage processes and minimize adverse effects on the environment; these may take the form of infiltration systems, whereby water soaks away into the ground, or attenuation systems, which release flows gradually to watercourses or sewers. Separate storm water and foul water systems can increase drainage capacity and reduce the likelihood of sewage mixing with pluvial flood water (Houston et al., 2011).
As mentioned previously, various uncertainties surround risk assessments for urban flooding, particularly those connected to climate change models and small-scale projections of extreme precipitation. Kaspersen & Kirsten (2017) proposed addressing these uncertainties with a very detailed integrated data and modelling approach, such as the DIAS (Danish Integrated Assessment System) tool they describe in detail, which can help identify particularly vulnerable and valuable assets that climate change adaptation measures should protect. While warnings about extreme weather events in Germany are issued nationwide by the German Meteorological Service, flood forecasting and warning is decentralized in Germany, which poses the main challenge of handling measured data coming from various providers and monitoring networks in individual formats (Osnabrugge et al., n.d.). The German Water Association (DWA) set up different working groups with the aim of establishing technical standards and providing affected interest groups with guidelines and practical advice; during a heavy rainfall event in 2013 with a return period of about 100 years, this proved to significantly reduce flood risk and gained public acceptance (Fritsch et al., 2016). As a further example, Hamburg has introduced a separate rain water drainage system in recent years and introduced financial penalties if rain water is not locally drained by home owners (Schlünzen & Bohnenstengel, 2016).
Projects related to pluvial flooding issues
The following table presents various examples of different projects, strategies and initiatives, implemented in European countries separately or in mutual cooperation across the continent, that deal with and manage pluvial flooding. Good examples of pluvial flooding risk management can be found outside Europe as well, and could be studied further in an attempt to adapt good practices from across the world. For example, China is currently in the process of implementing a policy initiative called sponge cities to holistically tackle urban pluvial flooding while promoting sustainable urban development with reduced environmental impact.
This initiative is well-grounded in scientific understanding.
Conclusions
Pluvial flooding, or flooding that results from intense localized rainfall exceeding the capacity of a drainage system, is gaining wider awareness in Europe. The estimate that cumulative damage from pluvial flooding over the years can be as high as, or even higher than, that from fluvial flooding events is a cause for concern. The risks posed by pluvial flooding are numerous, from economic losses and the destruction of private buildings and urban infrastructure to the loss of human lives, the deterioration of water quality and health impacts. With climate change projections indicating an increase in the frequency and intensity of rainfall events throughout Europe, and with ongoing urbanization and its own effects, adaptive and sustainable solutions should be explored and pursued as soon as possible. The problem of pluvial flooding is certainly starting to receive more attention, and some countries have already developed strategies for dealing with this hazard. As this review shows, there are already numerous research papers, studies, conferences and workshops dedicated to the problem of pluvial flooding and its management. Various project strategies and initiatives dealing with pluvial flood risk management have been implemented in European countries separately or in cooperation with one another. Some of the measures presented include spatial planning, constructional measures, risk acceptance, information systems, early warning systems for people, reducing land sealing through policies, building codes and standards and land use regulations, as well as adaptation measures such as the on-site improvement of retention, infiltration, evaporation, and rainfall water recycling with the use of green roofs, permeable or porous pavements, rain gardening or urban rainwater harvesting.
Acosta-Coll, M., Merelo, F., Peiro, M.M., & De la Hoz, E. (2018). Real-Time Early Warning System Design for Pluvial Flash Floods—A Review. Sensors, 18, 2255. DOI: 10.3390/s18072255
Arnbjerg-Nielsen, K., Willems, P., Olsson, J., Beecham, S., Pathirana, A., Gregersen, I., Madsen, H., & Nguyen, V.-T.V. (2013). Impacts of Climate Change on Rainfall Extremes and Urban Drainage Systems. Water Science and Technology, 68(1), 16-28. DOI: 10.2166/wst.2013.251
Ashley, R., Blanksby, J., Maguire, T., & Leahy, T. (2019). Frameworks for Adapting to Flood Risk: Experiences from the EU's Flood Resilient City Project.
Bakker, M.H.N., Green, C., Driessen, P., Hegger, D., Delvaux, B., van Rijswick, M., Suykens, C., Beyers, J.C., Deketelaere, K., van Doorn-Hoekveld, W., & Dieperink, C. (2013). Flood Risk Management in Europe: European flood regulation. STAR-FLOOD Consortium, Utrecht, The Netherlands. ISBN: 978-94-91933-04-2
Bernet, D.B., Prasuhn, V., & Weingartner, R. (2017). Surface water floods in Switzerland: what insurance claim records tell us about the damage in space and time. Natural Hazards and Earth System Sciences, 17, 1659-1682. DOI: 10.5194/nhess-17-1659-2017
Candela, A., & Aronica, G.T. (2016, September). Derivation of Rainfall Thresholds for Pluvial Flood Risk Warning in Urbanised Areas. XXXV Convegno Nazionale di Idraulica e Costruzioni Idrauliche, 14th–16th September 2016, Bologna, Italy. DOI: 10.1051/e3sconf/20160718016
Crichton, D. (2005). Flood risk & insurance in England & Wales: are there lessons to be learned from Scotland? Technical Papers 1, Benfield Greig Hazard Research Centre.
Dawson, R.J., Speight, L., Hall, J.W., Djordjevic, S., Savić, D., & Leandro, J. (2008). Attribution of flood risk in urban areas. Journal of Hydroinformatics, 10(4), 275-288. DOI: 10.2166/hydro.2008.054
Douglas, I., Garvin, S., Lawson, N., Richards, J., Tippett, J., & White, I. (2010). Urban pluvial flooding: a qualitative case study of cause, effect and nonstructural mitigation. Journal of Flood Risk Management, 3, 112-125. DOI: 10.1111/j.1753-318X.2010.01061.x
Environment Agency (2007). Review of 2007 summer floods. Environment Agency, Bristol.
European Environment Agency (2012). Urban adaptation to climate change in Europe. Challenges and opportunities for cities together with supportive national and European policies. EEA Report No 2/2012. Copenhagen, Denmark.
Evans, E.P., Simm, J.D., Thorne, C.R., Arnell, N.W., Ashley, R.M., Hess, T.M., Lane, S.N., Morris, J., Nicholls, R.J., Penning-Rowsell, E.C., Reynard, N.S., Saul, A.J., Tapsell, S.M., Watkinson, A.R., & Wheater, H.S. (2008). An update of the Foresight Future Flooding 2004 qualitative risk analysis. Cabinet Office, London.
Falconer, R. (2009, November). Pluvial Flooding and Surface Water Management. European Water Management and Implementation of the Floods Directive. 5th EWA Brussels Conference, 6th November 2009, Brussels, Belgium.
Falconer, R., Cobby, D., Smyth, P., Astle, G., Dent, J., & Golding, B. (2009). Pluvial flooding: New approaches in flood warning, mapping and risk management. Journal of Flood Risk Management, 2, 198-208. DOI: 10.1111/j.1753-318X.2009.01034.x
Falkenhagen, B. (2010, May). Flash flood and pluvial flooding from the point of view of the insurance industry. European Commission – WFD Common Implementation Strategy, WG F Thematic Workshop on Implementation of the Floods Directive 2007/60/EC, “Flash Floods and Pluvial Flooding”, 26th–28th May 2010, Cagliari, Italy.
Faust, E. (2018). Parts of Germany under water. Available at: https://www.munichre.com/topics-online/en/climate-change-and-natural-disasters/naturaldisasters/floods/floods-in-germany-2018.html
Fritsch, K., Assmann, A., & Tyrna, B. (2016, October). Long-term experiences with pluvial flood risk management. 3rd European Conference on Flood Risk Management, E3S Web of Conferences, 7, 04017, 17th–21st October 2016, Lyon, France. DOI: 10.1051/e3sconf/20160704017
Gaume, E., Bain, V., Bernardara, P., Newinger, O., Barbuc, M., Bateman, A., Blaškovičová, L., Blöschl, G., Borga, M., Dumitrescu, A., Daliakopoulos, I., Garcia, J., Irimescu, A., Kohnová, S., Koutroulis, A., Marchi, L., Matreata, S., Medina, V., Preciso, E., & Viglione, A. (2009). A Collation of Data on European Flash Floods. Journal of Hydrology, 367, 70-78. DOI: 10.1016/j.jhydrol.2008.12.028
Gaume, E., Borga, M., Llassat, M.C., Maouche, S., Lang, M., & Diakakis, M. (2016). Mediterranean extreme floods and flash floods. The Mediterranean Region under Climate Change. A Scientific Update, IRD Editions, 133-144.
Green, C., Dieperink, C., Ek, K., Hegger, D.L.T., Pettersson, M., Priest, S., & Tapsell, S. (2013). Flood Risk Management in Europe: the flood problem and interventions (report no D1.1.1), STAR-FLOOD.
Relevant answer
Answer
Flood forecasting for European rivers is a scientifically mature but evolving field, with growing attention to pluvial flood issues. European river forecasting systems combine meteorology, hydrology, and sophisticated computational modeling, relying on real-time monitoring (river gauges, rainfall radars, satellite data), hydrological models (simulating river responses to rainfall and snowmelt), meteorological forecasts (numerical weather prediction models from ECMWF, DWD, and Météo-France), and early warning systems such as the European Flood Awareness System (EFAS). These technologies allow authorities to plan for possible floods days in advance by providing probabilistic projections.
Absolute precision remains elusive, however. Uncertainties in rainfall predictions, where small errors may greatly affect flood volume and timing, combine with complicated interactions among urbanization, soil saturation, and abrupt snowmelt. Pluvial flooding is especially hard to forecast, since it results from short-duration, high-intensity rain overwhelming drainage systems before the runoff reaches any river. Still, advances in artificial intelligence, high-resolution modelling, and better data assimilation (including satellite observations and IoT sensors) are continuously improving forecasting accuracy.
Driven by climate-change-induced extreme rainfall, urbanization (which increases impermeable surfaces), and aging drainage infrastructure, pluvial flooding presents a growing danger. Europe is implementing structural and non-structural policies to reduce these risks, including green infrastructure (permeable pavements, rain gardens, green roofs), better urban planning (limiting development in flood-prone areas), real-time flash flood alerts, and regulatory frameworks such as the EU Floods Directive (2007/60/EC), which requires flood risk assessments and management plans. Although no system can forecast floods with 100% precision, continuous developments in adaptive infrastructure, monitoring, and modeling are enhancing Europe's resilience to both fluvial and pluvial flood threats.
  • asked a question related to Robustness
Question
2 answers
The h-index and i10-index provided by Google Scholar are generally more robust and dynamic compared to those on ResearchGate, which can be less transparent and less frequently updated. Given this, why are we relying on the ResearchGate h-index?
Relevant answer
Answer
Google Scholar citation metrics (the h-index and i10-index) are more transparent, reliable, and dynamic.
  • asked a question related to Robustness
Question
1 answer
What essential components and principles should be incorporated into a robust framework and set of guidelines for designing Local Adaptation Plans of Action (LAPAs) that effectively mainstream Climate Smart Village (CSV) approaches across diverse agro-ecological zones in India?
Relevant answer
Answer
One of the vital components is to integrate localized climate vulnerability assessments, using both science and local knowledge to map threats and capacities. Furthermore, plans should be measurable and actionable, with clearly defined timelines, roles, and provisions for local monitoring. Cc: Himanshu Tiwari
  • asked a question related to Robustness
Question
5 answers
Using SmartPLS4 in data analysis. I obtained two VIFs which are > to 10 and a HTMT which is equal to 1.07.
How to justify adequately these values and which robust and relevant references can be used to persuade the reviewers?
Thanks
Relevant answer
Answer
Thank you very much for your detailed answer.
Do you have any relevant references justifying the two points?
Regards,
  • asked a question related to Robustness
Question
2 answers
I am exploring sensor fusion strategies to combine data from gas sensors and anemometers for robotic navigation. The objective is to integrate these inputs in a manner that addresses their different response rates and noise characteristics, ultimately enhancing navigation robustness across varied settings. I’m specifically seeking general guidance on efficient fusion techniques, managing synchronization challenges, and handling signal discrepancies—all while keeping the approach abstract to avoid divulging project-specific details. Any insights or references to relevant literature would be greatly appreciated.
Relevant answer
Answer
Fusing gas sensor and anemometer data can significantly enhance robotic navigation, especially in scenarios involving environmental monitoring, search and rescue, or hazardous material handling. Here’s how these sensors complement each other and their applications:
1. Applications of Gas Sensor and Anemometer Data Fusion
  • Source Localization: Pinpointing the location of a gas source by tracking gas concentration gradients and wind direction.
  • Path Optimization: Planning paths that avoid dangerous areas with high gas concentrations.
  • Environmental Monitoring: Mapping pollutant dispersion or monitoring air quality in dynamic outdoor environments.
  • Search and Rescue: Detecting chemical leaks or tracking hazardous gases in disaster zones.
2. Key Challenges
  1. Gas Dispersion Complexity: Gas concentration can vary widely due to turbulence and environmental factors like temperature and pressure.
  2. Wind Dynamics: Wind can scatter gas unpredictably, creating non-linear patterns in gas concentration.
  3. Real-Time Computation: Continuous data fusion and path planning require efficient algorithms to process sensor data in real-time.
3. Data Fusion Techniques
The combination of gas sensor and anemometer data enables more accurate navigation decisions by leveraging complementary information:
A. Signal Filtering and Preprocessing
  • Use filters (e.g., Kalman Filter) to reduce noise in gas concentration and wind speed/direction data.
  • Normalize gas concentration data to account for variations in sensor sensitivity.
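As a minimal illustration of the Kalman filtering suggested in point A, the sketch below smooths a noisy gas-concentration signal with a scalar filter under a random-walk state model. The process-noise variance q and measurement-noise variance r are illustrative assumptions, not tuned values; in practice they would come from sensor datasheets and field calibration.
```python
import numpy as np

def kalman_smooth(measurements, q=1e-3, r=0.25):
    """Scalar Kalman filter: random-walk state, noisy measurements.

    q: assumed process-noise variance (how fast the true signal drifts)
    r: assumed measurement-noise variance (sensor noise level)
    """
    x = float(measurements[0])  # state estimate, seeded with first reading
    p = 1.0                     # estimate variance
    filtered = np.empty(len(measurements))
    for i, z in enumerate(measurements):
        p += q              # predict: uncertainty grows between samples
        k = p / (p + r)     # Kalman gain: trust placed in the new reading
        x += k * (z - x)    # update: blend prediction with the reading
        p *= (1.0 - k)
        filtered[i] = x
    return filtered

# Illustrative use: a slowly rising concentration buried in sensor noise
rng = np.random.default_rng(42)
true_signal = np.linspace(0.0, 5.0, 200)
noisy = true_signal + rng.normal(scale=0.5, size=200)
smoothed = kalman_smooth(noisy)
```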
B. Wind-Aware Gas Mapping
  • Construct a gas concentration map by fusing gas sensor readings with wind speed and direction from the anemometer.
  • Utilize Gaussian plume models to estimate gas dispersion in relation to wind.
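For the Gaussian plume estimate mentioned above, here is a hedged sketch of the standard ground-level plume formula with ground reflection. The dispersion coefficients sigma_y and sigma_z are placeholder arguments; in a real system they grow with downwind distance and atmospheric stability class rather than being fixed constants.
```python
import numpy as np

def plume_concentration(y_cross, emission_rate, wind_speed,
                        sigma_y, sigma_z, source_height=0.0):
    """Ground-level (z = 0) concentration from a continuous point source.

    Standard Gaussian plume with ground reflection; y_cross is the
    crosswind distance from the plume centerline.
    """
    lateral = np.exp(-y_cross**2 / (2.0 * sigma_y**2))
    vertical = 2.0 * np.exp(-source_height**2 / (2.0 * sigma_z**2))
    return (emission_rate / (2.0 * np.pi * wind_speed * sigma_y * sigma_z)
            * lateral * vertical)

# Illustrative call: 0.1 g/s source, 2 m/s wind, 5 m off the centerline
c = plume_concentration(y_cross=5.0, emission_rate=0.1, wind_speed=2.0,
                        sigma_y=8.0, sigma_z=4.0)
```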
C. Gradient-Based Navigation
  • Use the gas concentration gradient to guide the robot toward or away from a gas source.
  • Incorporate wind direction to predict likely gas dispersal paths and refine the navigation strategy.
D. Particle Filter Localization
  • Use particle filters to simulate potential gas source locations based on sensor data, iteratively refining estimates by considering wind patterns.
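A minimal sketch of one bootstrap particle-filter update over candidate source locations follows; the plume_model callable and the 0.1 likelihood width are hypothetical stand-ins for whatever dispersion model and sensor noise level a real system would use.
```python
import numpy as np

def particle_filter_step(particles, weights, reading, plume_model):
    """One weight-update/resample step of a bootstrap particle filter.

    particles:   (N, 2) hypothesized source positions
    weights:     (N,) importance weights, summing to 1
    reading:     the latest gas-concentration measurement
    plume_model: callable mapping a candidate source position to the
                 concentration we would expect to read (assumed supplied)
    """
    expected = np.apply_along_axis(plume_model, 1, particles)
    # Gaussian measurement likelihood with an assumed noise width of 0.1
    weights = weights * np.exp(-0.5 * ((reading - expected) / 0.1) ** 2)
    weights /= weights.sum()
    # Resample so the particle set concentrates on likely source locations
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```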
E. Reinforcement Learning
  • Train robots using Reinforcement Learning to learn navigation policies that balance exploration (seeking new information) and exploitation (following a strong signal).
F. Multisensor Fusion Algorithms
  • Combine gas and wind data using methods like:
    - Bayesian inference to probabilistically infer gas source location.
    - Grid-based mapping to maintain dynamic gas and wind distribution maps.
4. Example Use Case
Scenario: A robot tasked with locating a methane leak in an industrial plant.
  • Gas Sensor Role: Detects methane concentration, identifying areas of higher gas levels.
  • Anemometer Role: Measures wind direction and speed, helping predict the direction of gas dispersion.
  • Navigation Process: The robot uses the gas concentration gradient to head toward the suspected source. It adjusts its trajectory using wind data to account for gas dispersal patterns. Fused data is processed in real-time to refine path planning, enabling efficient and safe navigation.
5. Benefits of Fused Data for Robotic Navigation
Feature | Gas Sensor | Anemometer | Combined Benefit
Detect Gas Presence | ✅ | ❌ | Enhanced accuracy by corroborating gas readings with dispersion data.
Determine Dispersion Direction | ❌ | ✅ | Predictive modeling of gas movement based on wind.
Locate Source | ✅ (via gradient) | ✅ (via wind direction) | Faster and more precise source localization.
Avoid Dangerous Zones | ✅ | ✅ | Proactive path adjustments for safety.
6. Considerations for Implementation
  1. Sensor Placement: Position gas sensors and anemometers optimally on the robot to minimize interference.
  2. Environmental Modeling: Incorporate environmental factors like terrain, temperature, and pressure for more accurate predictions.
  3. Computational Efficiency: Optimize algorithms for real-time processing to handle dynamic environments effectively.
  • asked a question related to Robustness
Question
6 answers
Hi,
I'm having problems with silique filling in my WT plants. I'm getting robust inflorescences with thick shoots, but almost no seeds are being produced.
Has anyone had this problem before and can offer any advice?
Thank you!
Relevant answer
Answer
I will. Thank you!!
  • asked a question related to Robustness
Question
1 answer
DOLS is a parametric and FMOLS is a non-parametric approach. However, there are a number of studies that have applied both tests to the same dataset as a robustness check.
My question is how can both the parametric and non-parametric tests can be applied on the same dataset?
Relevant answer
Yes, FMOLS (Fully Modified OLS) and DOLS (Dynamic OLS) can both be applied to the same dataset as complementary methods to check the robustness of long-run cointegration relationships. Although FMOLS is non-parametric and DOLS is parametric, both correct for endogeneity and serial correlation in different ways. Applying both helps verify the stability and reliability of the estimated coefficients. If results from both methods are consistent, it strengthens the confidence in the findings. This dual approach is common in empirical economics to ensure robustness, especially when dealing with small samples or uncertainty about model specification.
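To make the complementarity concrete, below is a minimal sketch of a DOLS estimation in Python on synthetic cointegrated data: the long-run coefficient is obtained by OLS of y on x augmented with leads and lags of the differenced regressor, with HAC standard errors for inference. The lead/lag order p = 2 and the HAC lag length are illustrative choices; to my knowledge statsmodels has no built-in FMOLS routine, so FMOLS would come from a specialized package and be run on the same series for comparison.
```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic cointegrated pair: x is a random walk, y = 2x + stationary noise
rng = np.random.default_rng(0)
n = 200
x = np.cumsum(rng.normal(size=n))
y = 2.0 * x + rng.normal(size=n)

df = pd.DataFrame({"y": y, "x": x})
df["dx0"] = df["x"].diff()
p = 2  # number of leads and lags of the first difference
for k in range(1, p + 1):
    df[f"dx_lag{k}"] = df["dx0"].shift(k)
    df[f"dx_lead{k}"] = df["dx0"].shift(-k)
df = df.dropna()

# DOLS regression: y on x plus contemporaneous, lagged and led delta-x
cols = ["x", "dx0"] + [c for c in df.columns if "lag" in c or "lead" in c]
X = sm.add_constant(df[cols])
res = sm.OLS(df["y"], X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print(res.params["x"])  # long-run coefficient estimate, close to 2
```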
  • asked a question related to Robustness
Question
3 answers
Are regression models in the social sciences exact or not?
Relevant answer
Answer
Interesting last comment. If you follow all the required steps for regression modelling it is a very robust and reliable approach. It is only a statistical method though, and is only ever as good as the investigation to which is applied. I was surprised by the previous answer, so left field and so packed full of assumptions. Perhaps a misleading comment in this context. A statistical method should never be judged except in the context of the situation in which it is used. Regression modelling (read this as meaning all types of multivariate statistics using similar linear relationship assumptions) has been used with great benefits for nearly 100 years in the social sciences. I would call it a robust approach whether all the variance is captured or not. But like most tools it is only as good as the scientist using it and their knowledge of how to optimise it for the intended purpose. Endogeneity was rarely ever mentioned in my experience (The previous author may care to explain what they think this means, and provide some mathematical evidence for their claim). Typically in linear regression modelling, whatever the hypothesised source of variance (including endogeneity) might be, one can always create a dummy variable to represent it, and specify some parameters for it, and include it in the model. Clever statisticians do that all the time. More reading on the history and development of linear statistical models might help as well.
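To illustrate the dummy-variable point in the answer above, here is a minimal sketch with entirely hypothetical data: a binary indicator for a hypothesized source of variance is simply added as an extra regressor in an OLS model.
```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: y depends on x plus a binary variance source
rng = np.random.default_rng(1)
n = 300
group = rng.integers(0, 2, size=n)  # dummy for the hypothesized source
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + 0.8 * group + rng.normal(scale=0.5, size=n)

df = pd.DataFrame({"y": y, "x": x, "group": group})
model = smf.ols("y ~ x + C(group)", data=df).fit()
print(model.params)  # C(group)[T.1] estimates the dummy's contribution
```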
  • asked a question related to Robustness
Question
1 answer
Unfortunately, I do not have a robust system for performing molecular dynamics using Gromacs software. Please help me in this regard.
Relevant answer
Answer
  • asked a question related to Robustness
Question
1 answer
Hi everyone, I am seeking guidance on constructing an age-depth model for sedimentary deposits of Miocene age using foraminifera (Planktic and Benthic). While I am familiar with the Bayesian age modelling program Bacon, commonly employed for Holocene and recent sediments, I am uncertain of its applicability to the Miocene epoch. Could you please advise on whether Bacon is a suitable tool for this purpose? If not, I would be grateful for suggestions on alternative methodologies for age-depth modelling in Miocene sediments. Thank you.
Relevant answer
Answer
The first step is to know the structural characteristics of the region. Is the Miocene sequence folded? Is it faulted? If so, it must be modelled as a normal sequence, with at least an age determination of the real thickness.
  • asked a question related to Robustness
Question
5 answers
I performed a 3-replicate qPCR experiment and calculated the fold change using 2^(-ΔΔCt).
To combine the results of the replicates into one graph, I normalised the fold change by dividing the reference condition's fold change by itself (resulting in 1) and then dividing all other conditions by the reference.
I used the geometric mean to calculate the average fold change across the replicates, as I believe the arithmetic mean might be inaccurate in this case.
Is this approach correct for analysing my data? Or is there another more robust method you would recommend for this type of analysis?
Thank you!
Relevant answer
Answer
Your approach is generally correct and follows best practices for qPCR data analysis, but let’s go through some key considerations to ensure robustness.
1. Use of 2^(-ΔΔCt) for Fold Change Calculation
✅ This is the standard method for relative quantification in qPCR, assuming that amplification efficiencies are close to 100% (or comparable across targets).
2. Normalization to the Reference Condition
✅ Dividing all fold changes by the reference condition (making it 1) is a valid way to normalize the data for visualization, ensuring that relative changes are properly scaled.
3. Use of the Geometric Mean for Replicates
✅ The geometric mean is a better choice than the arithmetic mean when averaging fold changes because qPCR data is on a logarithmic scale. Since fold changes are multiplicative, using the geometric mean prevents overestimation or underestimation that could arise from an arithmetic mean.
Geometric Mean = (Fold Change₁ × Fold Change₂ × … × Fold Changeₙ)^(1/n)
This method correctly represents the central tendency of multiplicative data.
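A minimal sketch of this workflow, using hypothetical Ct values: ΔΔCt is computed per treated replicate against the mean control ΔCt, and the replicate fold changes are then averaged with the geometric mean.
```python
import numpy as np

# Hypothetical Ct values for 3 replicates (target and reference genes)
ct_target_control = np.array([24.1, 24.3, 24.0])
ct_ref_control    = np.array([18.0, 18.2, 17.9])
ct_target_treated = np.array([22.0, 22.4, 21.8])
ct_ref_treated    = np.array([18.1, 18.0, 18.2])

# Delta-Ct: target normalized to the reference gene
dct_control = ct_target_control - ct_ref_control
dct_treated = ct_target_treated - ct_ref_treated

# Delta-Delta-Ct per treated replicate, relative to mean control Delta-Ct
ddct = dct_treated - dct_control.mean()
fold_changes = 2.0 ** (-ddct)

# Geometric mean across replicates (appropriate for multiplicative data)
geo_mean = np.exp(np.mean(np.log(fold_changes)))
print(fold_changes, geo_mean)
```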
4. Statistical Analysis
To improve robustness, consider:
  • Log-transformation: Convert fold changes to log2 values before statistical testing (e.g., t-tests, ANOVA). This normalizes the distribution and makes variance more homogenous.
  • Error Representation: Instead of using standard deviation (which is not ideal for multiplicative data), report geometric standard deviation or use log-transformed confidence intervals.
  • Checking Efficiency: Ensure your primers and reaction conditions are optimized for near-identical amplification efficiency between target and reference genes.
Alternative Methods for Robustness
  • Mixed-effects models: If you have multiple conditions, replicates, or batch effects, using a linear mixed-effects model (e.g., with log-transformed fold changes) can provide more statistical rigor.
  • Bayesian approaches: If dealing with low expression levels, Bayesian models can account for technical variability better than traditional statistical tests.
Conclusion
Your method (geometric mean + normalization) is a solid and widely accepted approach for qPCR analysis. If you want to add statistical robustness, consider log-transformed statistical tests or mixed-effects models. Would you like guidance on specific statistical tests based on your experimental design? 😊
  • asked a question related to Robustness
Question
1 answer
In today’s rapidly evolving threat landscape, cyber risks are characterized by four critical dimensions: Velocity, Volume, Variety, and Visibility. These “4 Vs” present unique challenges, requiring organizations to adopt continuous assessment strategies that go beyond traditional, static risk evaluations.
This discussion seeks to explore how organizations can effectively implement continuous cyber risk assessment methodologies to address the dynamic nature of cyber risks while ensuring alignment with strategic business objectives.
Key questions include:
- What strategies and frameworks have proven effective in managing the 4 Vs of cyber risks?
- How can organizations enhance real-time risk visibility, prioritization and adaptability?
- What role do people, processes, and technology play in creating a robust approach to continuous cyber risk assessment?
- We invite researchers, practitioners, and cybersecurity enthusiasts to share insights, case studies, and innovative approaches to this pressing topic. Let’s collectively explore how continuous assessment can enable organizations to stay resilient in the face of ever-changing cyber threats.
Relevant answer
Answer
As framed in the question, the 4 Vs of cyber risks are Velocity, Volume, Variety, and Visibility. Managing these risks effectively requires a combination of strategies and frameworks that address each of these aspects. Here are some effective strategies and frameworks for managing the 4 Vs of cyber risks:
1. Risk Management Frameworks: Organizations can adopt risk management frameworks such as NIST Cybersecurity Framework (CSF), ISO/IEC 27001, or COBIT to provide a structured approach to managing cyber risks. These frameworks offer guidelines and best practices for identifying, assessing, and mitigating risks.
2. Continuous Monitoring: Implementing continuous monitoring solutions can help organizations enhance real-time risk visibility. This involves monitoring network traffic, system logs, and other relevant data to detect potential security breaches or anomalies.
3. Threat Intelligence: Organizations can leverage threat intelligence feeds to stay informed about the latest threats and vulnerabilities. This information can be used to update security controls and improve incident response capabilities.
4. Vulnerability Management: Regularly scanning systems and networks for vulnerabilities, and promptly addressing identified vulnerabilities, can help reduce the attack surface and minimize the impact of potential breaches.
5. Security Automation: Automating repetitive tasks such as log analysis, threat detection, and incident response can help organizations improve their adaptability and responsiveness to cyber threats.
6. Security Governance: Establishing a strong security governance framework can help ensure that security policies, procedures, and standards are effectively implemented and maintained across the organization.
7. Employee Education and Awareness: Educating and training employees on cybersecurity best practices can help prevent human-error-induced breaches. Regularly conducting security awareness training can foster a culture of security within the organization.
8. Collaboration and Communication: Encouraging collaboration and communication between different departments, such as IT, security, and business units, can help improve risk visibility and facilitate a more comprehensive understanding of cyber risks.
9. Technology Integration: Implementing a holistic cybersecurity solution that integrates various security tools and technologies can help organizations streamline their security operations and improve their overall cybersecurity posture.
People, processes, and technology all play crucial roles in creating a robust approach to continuous cyber risk assessment. Here's how each component contributes:
1. People: Skilled and knowledgeable security professionals are essential for developing and implementing effective cybersecurity strategies. They should stay updated with the latest threats, vulnerabilities, and best practices, and be able to adapt quickly to evolving cyber threats.
2. Processes: Well-defined security processes and procedures help ensure that cybersecurity measures are consistently applied across the organization. This includes incident response plans, vulnerability management processes, and regular security audits.
3. Technology: Advanced cybersecurity technologies, such as intrusion detection systems, security information and event management (SIEM) solutions, and encryption, can help organizations detect, prevent, and respond to cyber threats more effectively.
By combining these elements, organizations can create a robust approach to continuous cyber risk assessment that enhances real-time risk visibility, prioritization, and adaptability.
  • asked a question related to Robustness
Question
4 answers
When conducting empirical research on "the dynamic relationship between economic growth and greenhouse gas emissions for oil-exporting countries". From your point of view, what is the most suitable sample of countries for generating robust and policy-relevant findings?
Relevant answer
Answer
The most suitable sample of countries for generating robust and policy-relevant findings depends on the research objectives, policy area, and the level of global representation required. A balanced sample typically includes diversity in economic development, such as high-income countries (e.g., the United States, Germany), middle-income countries (e.g., Brazil, India), and low-income countries (e.g., Mali, Nepal). Geographic representation across continents is also essential, with examples like Nigeria (Africa), China (Asia), Germany (Europe), and Brazil (South America). Additionally, including countries with varying political systems—such as democracies (e.g., Sweden), authoritarian regimes (e.g., China), and hybrid systems (e.g., Turkey)—provides insights into how governance influences policy outcomes. Cultural diversity is another critical factor, represented by countries like India (pluralistic), Saudi Arabia (Islamic), and Japan (homogeneous). Furthermore, countries should reflect different policy approaches, such as the Nordic countries for welfare models, Singapore for urban planning, and Germany for renewable energy initiatives. The sample should also include countries of varying population sizes and geographic scales, such as China (large population), Iceland (small population), Russia (large landmass), and Singapore (small landmass). Relevance to the research topic is vital; for instance, a study on climate policy should include major emitters like the U.S. and China, as well as vulnerable nations like Bangladesh. Finally, the availability of reliable and comparable data from international sources like the World Bank is crucial to ensure the validity of findings. A representative sample might include the United States, Germany, China, India, Brazil, Nigeria, Sweden, Saudi Arabia, Bangladesh, and Australia, offering a diverse foundation for global policy analysis.
  • asked a question related to Robustness
Question
1 answer
Dear all, the relation between the discrete measurement noise matrix, Rk, and the continuous measurement noise matrix, R(t), of the Kalman filter is given in the book (Optimal and Robust Estimation With an Introduction to Stochastic Control Theory, 2008, page 87) as Rk = R(t)/T, where T is the sampling time. In this equation there is an imbalance in the units, because the units of Rk come out as those of R(t) per unit time. Could anyone help explain how this arises, given that the units of Rk should be the same as those of R(t)?
Relevant answer
Answer
Dear Tamer,
The formula you provide for Rk is meaningless until you specify the value of t. Maybe you mean Rk = R(T)/T (where T, as you said, is the sampling time)?
Anyway, your formula seems to imply a linear growth of the noise covariance with time. This is not inherent to the Kalman filter, but to the assumptions you make on the time evolution of noise. Different stochastic models lead to different relations between R(t) and Rk. In your case, it seems that the noise is modelled as a diffusion process (a continuous-time version of a random walk, something known as a Wiener process when the differential steps are Gaussian). In this kind of processes, the error grows as the square root of time, and hence its variance (or covariance) grows linearly with time. Thus, R(t) = R1*t for some R1 (note that R1 equals R(t) at t = 1). If you now take T (your sampling time) as the time unit, you get Rk = R1. Thus R(t) = Rk*t, where t is given in multiples of T. In particular, Rk = R(T)/T. Actually Rk = R(t)/t for any t (which allows to regard Rk as the time derivative of R(t)).
If, instead of a diffusion process, you modelled noise as a stationary Gauss-Markov process, for instance, then you'd have that R(t) would not depend on time (hence R(t) = Rk regardless of the value of t). So, you see that the relation between Rk and R(t) depends on your underlying stochastic model (the fact that you're using a Kalman filter has nothing to do with it). The choice of the stochastic model depends on the physical process you're trying to model. Making the right choice for your filter is not an easy task, and is often the key to success.
By the way, the same considerations apply to the process noise of the Kalman filter, not only to the measurement noise.
Hope this helps...
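One way to make the unit bookkeeping explicit, under the assumption that R in the book denotes the power spectral density of the continuous-time measurement noise (so that E[v(t)v(s)ᵀ] = R δ(t − s)) and that each discrete sample is the average of v(t) over one interval of length T:
```latex
% Discrete sample as a time average of the continuous noise:
\[
  v_k = \frac{1}{T} \int_{t_k}^{t_k + T} v(t)\, dt
\]
% Its covariance, using E[v(t) v(s)^T] = R\,\delta(t - s):
\[
  \operatorname{Cov}(v_k)
  = \frac{1}{T^2} \int_{t_k}^{t_k+T} \!\! \int_{t_k}^{t_k+T}
    R\, \delta(t - s)\, dt\, ds
  = \frac{R}{T} = R_k
\]
% Units: the delta function carries 1/time, so R has units of
% covariance x time; dividing by T returns plain covariance units,
% and the apparent imbalance disappears once R is read as a
% spectral density rather than a covariance.
```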
  • asked a question related to Robustness
Question
1 answer
Hi everyone, I am currently working on building an econometric model to predict cost overruns in construction projects in Quebec. This model will identify various economic variables (inflation, exchange rates, labor costs, etc.) and use historical data tested over different time periods to evaluate the model’s robustness and accuracy.
Given the complexity of this topic, I am seeking advice and suggestions from the community. Here are some questions I have:
  1. What are the key economic variables that you think should be considered in this model?
  2. Are there any specific econometric techniques or models that you would recommend for this type of analysis?
  3. Do you know of any similar studies or resources that could guide me in this research?
  4. How can I ensure the robustness and accuracy of my model?
  5. What are the common pitfalls or challenges in this kind of research and how can they be avoided?
  6. Where could I find reliable sources of historical data for this type of analysis?
Any advice or guidance would be greatly appreciated. Thank you in advance for your time and assistance 😊.
Relevant answer
Answer
The first steps in a research project involve identifying a clear and focused research topic based on a gap in existing knowledge or a problem that needs solving. This is followed by a review of relevant literature to understand the current state of research and refine the research questions or hypotheses. Next, defining the objectives, scope, and significance of the study sets the foundation. Finally, selecting an appropriate methodology and creating a research plan or proposal ensures a structured and systematic approach to achieving the study's goals.
  • asked a question related to Robustness
Question
1 answer
I am developing a modular educational robotics kit and looking for more robust and practical alternatives to traditional jumpers. The goal is to ensure secure and durable electrical connections while facilitating frequent assembly and disassembly by students. Would magnetic connectors, quick-release terminals, or other solutions be viable for this context?
Relevant answer
Answer
For some of our underwater robots, we use Wago quick-release connectors. These ensure a good electrical connection and seem quite good at handling vibrations without any wires coming loose. Depending on the amount of space available, it is possible to buy mounting brackets that fit DIN rails. However, we have 3D printed some mounts to fit inside our specific water-tight control bottles. They come as either 2-, 3- or 5-pin connectors and can handle wires from 0.2 to 4 mm².
  • asked a question related to Robustness
Question
1 answer
The same drivers that have heralded the rise of quantum open architecture will enable it to accelerate the advent of utility-scale quantum computing in the future. As technology progresses, the complexity and development cost of each quantum computing component grows exponentially. This provides a strong incentive for companies to down-scope and specialise vertically. Even companies that currently develop most of the quantum computing stack in-house will transition to QOA.
New Capabilities Will Expand the Scope of Quantum Open Architecture Quantum Computing
Next-generation components and systems play a pivotal role in advancing utility-scale quantum computing. Scalable quantum processors, like those developed by QuantWare, are essential for building larger, more powerful quantum systems. These processors are designed to integrate seamlessly with up-stack technologies (advanced control hardware and cryogenic systems) and down-stack (novel qubit types and application-specific designs), creating the foundation for robust quantum computing platforms.
source: Quantum Computing Horizons: The Future of Quantum Open Architecture
Relevant answer
Answer
A recently elaborated novel mathematical structure, substantially generalizing and effectively replacing conventional quantum mechanics (see https://soiguine.com for many details about Geometric Algebra Supercomputing, GASC), demonstrates that the core of GASC is not entanglement, whatever that may be. The entanglement-based implementations of “quantum computing” have wasted billions of dollars around the world without any reasonable success. Contrary to that, the GASC computer is a kind of analog computer simulated effectively with software on multithread GPU platforms.
The suggested simulation software runs on Snapdragon ARM platforms equipped with Adreno GPUs. Thus, we get a PC that addresses two challenging issues of the near future (energy consumption and chip manufacturing) and tremendously outstrips existing early-stage samples of entanglement-based quantum computing monsters in calculation power and effectiveness.
  • asked a question related to Robustness
Question
1 answer
Sub-Research Questions
1. What are the socio-economic impacts of poor data quality on financial forecasting in developing economies?
• This question aims to explore how data quality issues specifically affect financial forecasting in developing economies, where data infrastructure may be less robust. It addresses the broader socio-economic implications, such as the impact on economic growth, investment decisions, and financial stability.
2. How can emerging technologies, such as artificial intelligence and blockchain, be leveraged to improve data quality in financial forecasting?
• This question focuses on the potential of emerging technologies to enhance data quality. It explores how AI and blockchain can be used to ensure data accuracy, integrity, and reliability in financial forecasting.
Relevant answer
Answer
Data Quality and Financial Forecasting:
Data quality is crucial for accurate and reliable financial forecasting. High-quality data ensures precise predictions, while poor data can lead to flawed decision-making, financial mismanagement, and loss of trust. Accurate data supports budgeting, investment decisions, and risk assessments, whereas poor data risks causing inaccurate forecasts and financial instability.
Sub-Research Questions:
  1. Socio-Economic Impacts in Developing Economies: In developing economies, poor data quality can hinder economic growth, deter investment, and jeopardize financial stability. Misguided policies, inefficient resource allocation, and inadequate risk assessments can result in slower development and negative socio-economic outcomes, especially for marginalized communities.
  2. Emerging Technologies to Improve Data Quality: Emerging technologies like AI and blockchain can enhance data quality in financial forecasting. AI: automates data cleaning, improves predictive accuracy, and uses NLP for insights from unstructured data. Blockchain: ensures data integrity with tamper-proof records, provides transparency, and improves traceability, making financial data more reliable and trustworthy for forecasting.
Together, AI and blockchain can mitigate data quality issues, ensuring more accurate and reliable financial forecasting.
  • asked a question related to Robustness
Question
1 answer
Themes ought to:
  • guarantee accurate portrayal of participants' opinions;
  • be both externally divergent (distinct from other themes) and internally consistent (coherent within the theme);
  • be evaluated iteratively, incorporating triangulation or peer debriefing to increase dependability.
Relevant answer
Answer
If anyone has perspectives, please share.
  • asked a question related to Robustness
Question
1 answer
Human rights violations occur when actions by state or non-state actors infringe upon the basic rights and freedoms to which all humans are entitled, as outlined in international agreements like the Universal Declaration of Human Rights (UDHR) and various other treaties. These rights encompass civil, political, economic, social, and cultural dimensions, which are essential for dignity, freedom, and equality. Violations can take many forms, including, but not limited to:
1. Civil and Political Rights Violations
  • Arbitrary Detention and Imprisonment: Detaining individuals without fair trial or due process, often for political reasons, suppresses freedom and violates the right to a fair judicial process.
  • Torture and Inhumane Treatment: Subjecting people to physical or psychological harm, often to punish or intimidate, breaches the fundamental right to be free from cruel, inhuman, or degrading treatment.
  • Suppression of Freedom of Expression and Assembly: Restricting people's rights to express opinions, protest peacefully, or associate freely undermines democratic principles and basic civil liberties.
  • Discrimination: Denying individuals rights based on characteristics such as race, gender, ethnicity, religion, or disability violates the principle of equality and nondiscrimination.
2. Economic, Social, and Cultural Rights Violations
  • Denial of Basic Health Services: Restricting access to essential healthcare services and clean water endangers lives and violates the right to health.
  • Forced Evictions and Housing Insecurity: Forcing people out of their homes or failing to provide adequate housing affects the right to a standard of living adequate for health and well-being.
  • Child Labor and Exploitation: Engaging children in harmful work denies them their rights to education, safety, and development.
  • Educational Deprivation: Denying or restricting access to education, particularly for marginalized groups, violates the right to education and limits opportunities for future well-being.
3. Genocide, War Crimes, and Crimes Against Humanity
  • Genocide: Systematic targeting of a group based on ethnicity, religion, or nationality with intent to destroy is considered one of the gravest human rights violations.
  • War Crimes: Actions that breach the Geneva Conventions, such as targeting civilians during conflict, using prohibited weapons, or committing sexual violence, constitute war crimes.
  • Crimes Against Humanity: Large-scale attacks on civilians, such as enslavement, extermination, or persecution, are violations of fundamental human rights.
4. Environmental Degradation and Climate-Related Violations
  • Denial of Access to Safe Environments: Polluting water sources, contaminating land, and exposing communities to toxic substances infringe upon the rights to health and life.
  • Climate Change Impacts on Human Rights: Actions that contribute to climate change, leading to displacement or destruction of livelihoods, increasingly affect the rights to life, health, food, and shelter for vulnerable populations.
5. Gender-Based Violence and Discrimination
  • Violence Against Women and Girls: Gender-based violence, such as domestic abuse, sexual violence, and female genital mutilation (FGM), violates women’s rights to security and bodily autonomy.
  • Discrimination in Law and Practice: Laws or practices that deny women equal opportunities, rights to inheritance, or access to employment undermine gender equality and women’s empowerment.
Mechanisms for Addressing Human Rights Violations
International and regional bodies, such as the United Nations, the International Criminal Court (ICC), and human rights organizations, work to document, report, and advocate against human rights violations. Victims and civil society groups often rely on these organizations to seek accountability, raise awareness, and push for legislative or policy reforms. However, persistent challenges remain, especially in areas where governments or powerful groups are implicated in rights abuses.
Ending human rights violations requires robust legal frameworks, political will, international cooperation, and a strong civil society that advocates for justice, accountability, and systemic reforms that protect individuals and uphold fundamental human rights.
Relevant answer
Answer
1. Integration of Technology and AI
  • AI for Legal Analysis: Utilizing artificial intelligence (AI) for analyzing large volumes of legal documents, judgments, and treaties can identify patterns in human rights violations and judicial trends. Example: AI-powered tools like LexisNexis or Westlaw Edge can assist in comparative legal research across jurisdictions.
  • Blockchain for Data Integrity: Blockchain technology can be employed to ensure the integrity and immutability of evidence in human rights cases, especially in conflict zones.
  • Legal Tech Training: Equipping researchers and practitioners with skills to use advanced legal research platforms.
2. Interdisciplinary Approaches
  • Social Sciences and Law: Collaboration with sociologists, political scientists, and anthropologists can offer a deeper understanding of the socio-political context of human rights issues.
  • Environmental Law and Human Rights: Exploring the nexus between environmental protection and human rights, particularly with the rise of climate-related displacement and ecological justice.
  • Data Science Integration: Employing statistical and data analysis methods to map trends in human rights violations globally.
3. Focus on Comparative Legal Research
  • Harmonizing International Standards: Conducting comparative studies on the implementation of human rights conventions, such as the Universal Declaration of Human Rights (UDHR), European Convention on Human Rights (ECHR), and regional instruments like the African Charter on Human and Peoples' Rights.
  • Impact Assessments: Evaluating how different legal systems integrate human rights principles into domestic law and identifying best practices.
4. Global Collaboration and Networking
  • International Databases: Developing open-access repositories for human rights cases, treaties, and academic articles to promote equitable access to information.
  • Cross-Border Partnerships: Encouraging collaboration among universities, NGOs, and international organizations to address transnational issues like migration and refugee rights.
  • Human Rights Clinics: Establishing legal clinics in academic institutions to provide practical experience and foster collaboration on global issues.
5. Policy-Oriented Research
  • Practical Solutions: Shifting the focus from theoretical analysis to actionable recommendations that address current human rights challenges, such as online privacy or digital surveillance.
  • Legislative Reform Proposals: Drafting model laws and policies to bridge gaps in domestic legal frameworks concerning human rights obligations.
6. Incorporating Marginalized Perspectives
  • Indigenous and Minority Rights: Prioritizing research on the rights of marginalized communities and indigenous populations.
  • Gender Lens: Focusing on issues of gender justice, equality, and women's rights, including the role of international treaties like CEDAW (Convention on the Elimination of All Forms of Discrimination Against Women).
7. Practical Training and Capacity Building
  • Workshops and Seminars: Regular capacity-building initiatives for legal researchers and practitioners in emerging areas like digital rights, AI ethics, and climate justice.
  • Skill Development Programs: Encouraging legal researchers to acquire skills in advocacy, negotiation, and public speaking to translate research into real-world impact.
8. Enhanced Funding and Institutional Support
  • Grants for Innovation: Encouraging governments, international bodies, and private foundations to fund innovative legal research projects.
  • Dedicated Research Centers: Establishing institutes specializing in human rights research, such as the Raoul Wallenberg Institute of Human Rights and Humanitarian Law.
9. Monitoring and Evaluation
  • Impact Metrics: Developing metrics to assess the impact of human rights research on policy, litigation, and awareness.
  • Feedback Mechanisms: Regularly updating research methods and priorities based on feedback from practitioners, communities, and policymakers.
  • asked a question related to Robustness
Question
1 answer
Hi. I am constructing multidimensional data quality indicators for a low-cost wireless sensor network. Currently, they are tested using synthetic data that reproduce data quality issues such as accuracy, timeliness, completeness and reliability. Are there any testing methods using real-world datasets that can test the robustness of the indicators?
Relevant answer
Answer
The multidimensional data quality indicators that I developed for the low-cost sensors are based on the outcome of my systematic literature review in the attachment.
  • asked a question related to Robustness
Question
2 answers
Hello,
Is ML robust to non-normal distribution data or does it assume normality in Mplus?
Best,
Relevant answer
Answer
If "ML" stands for "maximum likelihood", then the answer is:
There is no need for ML to be "robust to non-normal data" (btw: it's not the distribution of the data, but rather the distribution of the variable!), because ML can use any distribution model.
The difficulty is to get p-values from the likelihood ratio, because the sampling distribution of the likelihood ratio is usually unknown. Here, the likelihood is often approximated by a scaled multivariate normal or Wishart distribution, for which the sampling distribution of the likelihood ratio statistic is known (Chi² or F). This is the point where ML assumes at least approximate normality of the variable.
The likelihood ratio can be determined from the actual values of the likelihood. However, some algorithms do not explicitly evaluate the function in the restricted parameter space and use another approximation to get an approximate value of the maximum likelihood in the restricted parameter space (by Wilks' theorem, for those who like to google). This approximation assumes that the log-likelihood around its maximum is close to a paraboloid, which again is the case when the variable is approximately normally distributed. So in this case the normal assumption is used twice. This second use can produce quite wrong results when the variable is non-normal and the maximum is close to the boundary of the parameter space (then the likelihood is quite asymmetric, and assuming symmetry is not adequate).
Using the normal approximation in both steps (using it in the second implies using it in the first step, too) makes the ML approach identical to the LS approach, and there it has the same sensitivity to deviations from the assumptions as the LS approach.
Or the other way around: ML is the more general approach, and in the special case when the distribution of the response variable is assumed normal, there exist some neat short-cuts in the calculation, circumventing the need to explicitly evaluate the likelihood to find the maximum and the standard errors. This short-cut is called the LS approach.
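A toy sketch of that last point (plain NumPy/SciPy, with invented data): fitting the same linear model once by least squares and once by numerically maximizing a Gaussian log-likelihood returns the same coefficients up to optimizer tolerance, which is exactly the short-cut described above.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1.5, n)        # invented data
X = np.column_stack([np.ones(n), x])

# Least squares
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Maximum likelihood under a normal error model
def negloglik(params):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)                     # keeps sigma positive
    resid = y - b0 - b1 * x
    return 0.5 * np.sum(resid**2) / sigma**2 + n * log_sigma + 0.5 * n * np.log(2 * np.pi)

res = minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 5000})
print("LS coefficients:", beta_ls)
print("ML coefficients:", res.x[:2])              # same up to tolerance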
  • asked a question related to Robustness
Question
3 answers
What sample size considerations are critical for achieving robust results in SEM, especially when using complex models with multiple latent variables?
Relevant answer
Answer
Here is a link to a widely cited article on sample size and SEM.
  • asked a question related to Robustness
Question
3 answers
Physics is the subject of non-multiple explanations, i.e., one robust explanation; it derives part of its prestige from this.
However, as gravity is at the same time explained as curvature of spacetime and an effect of the presence of mass, this principle has an exception.
What are the implications? I.e., can science in the future be a multi-explanation discipline, provided that these explanations are simple, precise, and unify a minimal number of phenomena?
Relevant answer
Answer
That
“…Physics is the subject of non-multiple explanations, i.e., one robust explanation; it derives part of its prestige from this.…”
- looks like a rather strange claim. Any/every scientific object/effect/process principally has only a unique, really scientific explanation, in any/every science, including physics.
Correspondingly, as that is rigorously scientifically proven in the Shevchenko-Tokarevsky’s Planck scale informational physical model, in this case it is enough to read one of 2 main papers https://www.researchgate.net/publication/383127718_The_Informational_Physical_Model_and_Fundamental_Problems_in_Physics
- Matter’s spacetime is fundamentally absolute,
- and it is rigorously scientifically shown, that it is fundamentally flat, fundamentally continuous, and fundamentally “Cartesian”, (at least) [4+4+1]4D spacetime with metrics (at least) (cτ,X,Y,Z, g,w,e,s,ct), i.e. is only some (at least) [4+4+1]4D “empty container” that has infinite dimensions;
- which fundamentally cannot be impacted by anything in Matter, including “contracted”, “dilated”, “curved”, etc.; and fundamentally cannot impact on anything in Matter.
Really, Gravity fundamentally is nothing else than some fundamental Nature force, and, like the other Forces [Weak, Electric, Nuclear/Strong], which act in “flat” spacetime [that is another thing, that the mainstream physics spacetime really is rather strange, but that is to a certain extent outside this thread's question], acts in the [4+4+1]4D “flat” spacetime above.
Moreover, Matter is a rather simple logical construction, and, as is completely rigorously scientifically shown in the SS&VT model (for more, see section 6 in the link above, where initial Planck-scale models of the Gravity, Electric, and Nuclear Forces are presented), at least these 3 Forces act by the same scheme.
- are relevant to this thread question.
Cheers
  • asked a question related to Robustness
Question
4 answers
By modifying the original Feistel structure, would it be feasible to design a lightweight and robust encryption algorithm, somehow changing the structure's original flow and adding some mathematical functions there? I welcome everyone's views.
Relevant answer
Answer
Yes, it is indeed feasible to design a lightweight algorithm based on the Feistel structure. The Feistel network is a popular symmetric structure used in many modern cryptographic algorithms, such as DES (Data Encryption Standard). The design of a lightweight Feistel-based algorithm can effectively balance security and efficiency, making it suitable for environments with constrained resources, such as IoT devices and resource-limited systems.
Key Considerations for Designing a Lightweight Feistel-Based Algorithm
Feistel Structure Basics:
The Feistel structure divides the data into two halves and applies a series of rounds where the right half is modified using a function (often called the round function) combined with a subkey derived from the main key.
The left and right halves are then swapped after each round, employing the same round function iteratively over several rounds.
Lightweight Design Goals:
Reduced Resource Usage: The algorithm should minimize memory and processing requirements, which are crucial in lightweight applications.
Efficient Implementation: It should have efficient implementations in hardware (e.g., FPGAs, ASICs) as well as software (e.g., microcontrollers).
Security: While optimizing for lightweight design, the algorithm must maintain a sufficient level of security against common attacks (such as differential and linear cryptanalysis).
Steps in Designing a Lightweight Feistel Algorithm
Key Design Choices:
Number of Rounds: Determine the optimal number of rounds needed to achieve desired security without excessive computational cost. For lightweight applications, 4 to 8 rounds may be sufficient.
Block Size: Choose a block size that is suitable for the intended application. Smaller block sizes (e.g., 64 or 128 bits) may be appropriate for constrained environments.
Key Size: Develop a flexible key size that provides adequate security while keeping the implementation lightweight. A key size between 80 and 128 bits is commonly used for lightweight designs.
Round Function Design:
Simplicity and Efficiency: The round function should be computationally efficient, possibly utilizing modular arithmetic or simple logical operations (AND, OR, XOR) to enhance speed and reduce footprint.
Subkey Generation: Efficient and secure key scheduling is essential to generate round keys from the primary key, ensuring that each round has a unique key.
Attack Resistance:
Differential and Linear Cryptanalysis: Analyze the design for vulnerabilities to these forms of attacks. The choice of S-boxes in the round function can significantly enhance resistance.
Avalanche Effect: Ensure that a small change in the input or the key results in a significant change in the output.
Performance Optimization:
Implementation Flexibility: Design the algorithm to allow for easy adaptation for different platforms (hardware vs. software) to maximize performance.
Minimalistic Approach: Reduce unnecessary complexity in the algorithm to lower resource consumption, focusing only on essential components.
Example Lightweight Feistel Structure
While developing a specific algorithm, you could consider a structure similar to the following:
function LightweightFeistelEncrypt(plaintext, key):
    Split plaintext into left (L0) and right (R0)
    For i from 1 to n (number of rounds):
        Ri = Li-1 XOR F(Ri-1, Ki)
        Li = Ri-1
    return (Ln, Rn)

function F(input, k):
    // Simple round function using lightweight operations
    // Example could include small S-boxes and XOR operations
    return output
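And here is a hedged, runnable Python sketch of the same scheme (the round function, constants and key schedule below are invented purely for illustration and provide NO real security; a production design would need vetted S-boxes and proper cryptanalysis):

# Toy Feistel cipher matching the pseudocode above: 32-bit blocks, 16-bit halves.
MASK16 = 0xFFFF

def round_fn(half, subkey):
    # Lightweight mix of XOR, 16-bit rotation and addition
    x = (half ^ subkey) & MASK16
    x = ((x << 3) | (x >> 13)) & MASK16       # rotate left by 3
    return (x + 0x9E37) & MASK16              # arbitrary odd constant

def subkeys(key, rounds):
    # Trivial (insecure) key schedule: rotate the 16-bit key each round
    return [((key << i) | (key >> (16 - i))) & MASK16 for i in range(rounds)]

def encrypt(block32, key, rounds=8):
    left, right = (block32 >> 16) & MASK16, block32 & MASK16
    for k in subkeys(key, rounds):
        left, right = right, left ^ round_fn(right, k)
    return (left << 16) | right

def decrypt(block32, key, rounds=8):
    left, right = (block32 >> 16) & MASK16, block32 & MASK16
    for k in reversed(subkeys(key, rounds)):
        right, left = left, right ^ round_fn(left, k)
    return (left << 16) | right

pt = 0x1234ABCD
ct = encrypt(pt, key=0xBEEF)
assert decrypt(ct, key=0xBEEF) == pt
print(hex(ct))

Note that decryption just replays the rounds with the subkeys reversed; the round function F never needs to be invertible, which is the structural property that makes Feistel networks attractive for lightweight designs.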
  • asked a question related to Robustness
Question
2 answers
Hi everyone,
I am quite confused about the efficiency of ATE estimators. For example, AIPW (doubly robust) is more efficient than IPW if outcome regression is correctly specified.
But many papers (e.g., "Bounded, efficient and doubly robust estimation with inverse weighting") also say IPW can achieve the semiparametric efficiency bound. Since the bound is the lowest attainable asymptotic variance, does it mean IPW and AIPW have the same asymptotic variance?
Thanks for your help in advance!
Best,
Jun
Relevant answer
Answer
The efficiency bound for Average Treatment Effect (ATE) estimators in causal inference refers to the minimum asymptotic variance that a regular estimator can achieve. It is the semiparametric analogue of the Cramér-Rao lower bound and depends on the data distribution and on the model assumptions maintained. It sets a benchmark for assessing the performance of different ATE estimators.
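To make the IPW-versus-AIPW comparison concrete, here is a small Monte Carlo sketch (synthetic data; scikit-learn for the nuisance models; everything invented for illustration, not taken from the cited paper). With a correctly specified outcome model, the AIPW draws typically show a visibly smaller standard deviation than plain IPW, even though both are consistent:

import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)

def one_draw(n=2000, tau=1.0):
    x = rng.normal(size=(n, 2))
    p = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.5 * x[:, 1])))          # true propensity
    a = rng.binomial(1, p)
    y = tau * a + x @ np.array([1.0, -1.0]) + rng.normal(size=n)

    e = LogisticRegression().fit(x, a).predict_proba(x)[:, 1]       # propensity model
    m1 = LinearRegression().fit(x[a == 1], y[a == 1]).predict(x)    # outcome model, treated
    m0 = LinearRegression().fit(x[a == 0], y[a == 0]).predict(x)    # outcome model, control

    ipw = np.mean(a * y / e - (1 - a) * y / (1 - e))
    aipw = np.mean(m1 - m0 + a * (y - m1) / e - (1 - a) * (y - m0) / (1 - e))
    return ipw, aipw

draws = np.array([one_draw() for _ in range(300)])
print("IPW : mean %.3f  sd %.3f" % (draws[:, 0].mean(), draws[:, 0].std()))
print("AIPW: mean %.3f  sd %.3f" % (draws[:, 1].mean(), draws[:, 1].std()))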
  • asked a question related to Robustness
Question
3 answers
Can someone suggest a robust and reliable PCR protocol for sex determination from mouse genomic DNA?
Relevant answer
Hello there,
I suggest you read this useful article, which has a detailed procedure for how to carry this out.
  • asked a question related to Robustness
Question
9 answers
Long story short:
I use a long unbalanced panel data set.
All tests indicate that 'fixed effects' is more appropriate than 'random effects' or 'pooled OLS'.
No serial correlation.
BUT, heteroskedasticity is present, even with robust White standard errors.
Can someone suggest a way to either 'remove' or just 'deal with' heteroskedasticity in a panel data model?
Relevant answer
Answer
To address heteroskedasticity in a panel data model:
  1. Use Robust Standard Errors: Apply heteroskedasticity-consistent standard errors (e.g., White’s robust standard errors). (A short sketch follows this list.)
  2. Transform Data: Consider log-transformations or other data transformations to stabilize variance.
  3. Apply Generalized Least Squares (GLS): Use GLS techniques that adjust for heteroskedasticity.
  4. Include Additional Variables: Add relevant covariates that might explain the variance in the residuals.
  5. Test and Adjust: Use tests like the Breusch-Pagan test to diagnose heteroskedasticity and adjust your model accordingly.
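As a sketch of points 1 and 3 in practice, assuming Python with the linearmodels package (the panel, variable names and parameters are all invented): a fixed-effects estimate with entity-clustered standard errors, which are robust to arbitrary heteroskedasticity and within-unit correlation.

import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

# Invented unbalanced panel with heteroskedastic errors
rng = np.random.default_rng(3)
rows = []
for firm in range(50):
    for year in range(2000, 2000 + int(rng.integers(5, 15))):
        x = rng.normal()
        rows.append((firm, year, x, 1.5 * x + rng.normal(scale=1 + abs(x))))
df = pd.DataFrame(rows, columns=["firm", "year", "x", "y"]).set_index(["firm", "year"])

# Fixed effects; clustering by entity makes the SEs heteroskedasticity-robust
fe = PanelOLS.from_formula("y ~ 1 + x + EntityEffects", data=df)
print(fe.fit(cov_type="clustered", cluster_entity=True))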
  • asked a question related to Robustness
Question
2 answers
What components, as well as what type of data, are needed to design a robust electronic load controller for a 300 kVA alternator in a mini hydro plant (150 kVA)?
Relevant answer
Answer
In countries where demand is greater than supply (generation), microprocessor-based load-shedding controllers are used together with economic load dispatch. You could refer to my paper, "Microprocessor based load shedding controller", in the Journal of IE(I), Electrical Division …
  • asked a question related to Robustness
Question
5 answers
I have identified many solutions. I need suggestions from somebody with application experience in this topic to identify the most reliable and robust procedure.
Relevant answer
Answer
I read you are in Chennai. I was there many times in the past to collaborate with NIOT. Many thanks again for your clear and complete suggestions.
Daniele
  • asked a question related to Robustness
Question
3 answers
Why does information theory explain aging, evolution vs creationism, critical rationalism, computer programming and much more?
Perhaps information has a very open definition and is thus very robust.
Relevant answer
Answer
All that is cut and paste salesmanship. How about some concrete examples of each?
  • asked a question related to Robustness
Question
4 answers
Help Needed! Impact of Sustainable Materials on Project Management
Hi everyone, I'm working on my thesis titled "Impact of Sustainable Materials on Project Management in the Construction Industry". I'm reaching out to construction professionals, project managers, green building experts, professors, lecturers, and students for your valuable insights!
In this research, I'm exploring how the use of sustainable materials is influencing project management practices within the construction industry. Understanding these impacts will be crucial for promoting more sustainable building practices in the future.
To contribute to my research, I've created a short survey that takes less than 10 minutes to complete. Your participation would be greatly appreciated! https://lnkd.in/d4Hj9aYr
Who can participate?
  • Construction professionals (architects, engineers, project managers)
  • Green building experts and sustainability consultants
  • Professors, lecturers, and students in construction management or related fields
What are the benefits of participating?
  • Your insights will contribute to valuable research on sustainable construction practices.
  • You'll help to shape the future of project management in the construction industry.
  • You'll receive a summary of the research findings once completed (optional).
Sharing is caring! Please share this post with your network of construction professionals and anyone interested in sustainable building practices. The more responses I receive, the more robust the research will be.
Relevant answer
Answer
Very interesting survey :) Thanks
  • asked a question related to Robustness
Question
4 answers
Dear Community,
I would like to develop and validate a qualitative NMR method for the analysis of a specific category of chemicals, and my question is the following: What are the criteria that I must assess?
I found a publication that states that the limit of detection (LOD), the specificity and selectivity (though NMR is inherently specific), and the robustness must be investigated for a qualitative NMR method. But then what would be the experimental protocol to assess those criteria? For example: is there a need to perform the same experiment multiple times (on the same sample) and calculate the standard deviation for the estimation of the LOD, or would just one experiment be enough?
I would be more than grateful if someone experienced in NMR method development could provide more details on the subject.
Thank you.
Relevant answer
Answer
It is more the second scenario: I will be looking for the presence of specific analytes (which number in the millions) by combining 1D and 2D, 1H- and 31P-based NMR experiments in an attempt to reveal their signals.
Samples can be anything from aqueous to organic solvents, with some typical contaminants (e.g., diesel, PEGs, other non-pertinent chemicals, etc.).
Thank you.
  • asked a question related to Robustness
Question
3 answers
Reversed flow on 28 faces of pressure-outlet 7.
Stabilizing mp-x-momentum to enhance linear solver robustness.
Stabilizing mp-x-momentum using GMRES to enhance linear solver robustness.
Stabilizing mp-y-momentum to enhance linear solver robustness.
Stabilizing mp-y-momentum using GMRES to enhance linear solver robustness.
Stabilizing k to enhance linear solver robustness.
Stabilizing k using GMRES to enhance linear solver robustness.
Divergence detected in AMG solver: k Stabilizing epsilon to enhance linear solver robustness.
Stabilizing epsilon using GMRES to enhance linear solver robustness.
Divergence detected in AMG solver: epsilon Stabilizing flue-gas-species-0 to enhance linear solver robustness.
Stabilizing flue-gas-species-1 to enhance linear solver robustness.
Stabilizing temperature to enhance linear solver robustness.
Stabilizing temperature using GMRES to enhance linear solver robustness.
Stabilizing vof-1 to enhance linear solver robustness.
absolute pressure limited to 5.000000e+10 in 2708 cells on zone 4
turbulent viscosity limited to viscosity ratio of 1.000000e+05 in 7089 cells
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Divergence detected in AMG solver: k
Divergence detected in AMG solver: epsilon
Error at host: floating point exception
===============Message from the Cortex Process================================
Compute processes interrupted. Processing can be resumed.
==============================================================================
Error at Node 2: floating point exception
Error at Node 5: floating point exception
Error at Node 3: floating point exception
Error at Node 9: floating point exception
Error at Node 1: floating point exception
Error at Node 4: floating point exception
Error at Node 7: floating point exception
Error at Node 11: floating point exception
Error at Node 0: floating point exception
Error at Node 6: floating point exception
Error at Node 8: floating point exception
Error at Node 10: floating point exception
Relevant answer
Answer
Thank you all for your suggestions.
  • asked a question related to Robustness
Question
1 answer
a) Design a scalable and robust network architecture that can handle the increasing data traffic and support various communication technologies.
b) Recommend suitable transmission technologies, such as fiber optics, microwave or satellite, based on factors like bandwidth requirements, distance coverage and reliability.
c) Incorporate robust security measures to protect the network against cyber threats and ensure high reliability through redundancy and backup systems.
Relevant answer
Arun Yadav Current solutions to achieve flexible, secure, scalable, highly available and economical networks are based on SDN technology and, in the case of distant multisite links, SD-WAN with layer 2 overlay technologies over the Internet.
These solutions make it possible to have different concurrent Internet (or WAN) access technologies and different operators simultaneously, and they offer a private layer 2 (Ethernet) plane independent of the transport networks and operators. They offer ring redundancy utilities with DualHoming over the Internet, bandwidth aggregation, and NFV redundancy such as VirtualSwitch.
  • asked a question related to Robustness
Question
1 answer
Hello all,
I'm starting a project where we want to automate the video analysis of people working in different environments to produce ergonomic measures like hip flexion, shoulder extension, etc. We already tested some AI-based libraries like OpenPose, MMPose, Mediapipe, etc.
One of the project's objectives was to estimate the precision and robustness of these vision solutions compared to some ground truth obtained by physical MoCap systems like the Movella system, previously known as XSens.
The problem is that the price of the Movella system, 8,500USD for hardware and 13,500USD/year for software, is way too high for our limited expected usage (maybe 1 month?). Do you know of some other MoCap systems that might be appropriate for this usage?
Note: We don't need a system so robust as to resist a fight scene motion capture.
Relevant answer
Answer
Dear Martin,
Thank you for your question; it prompted me to read and learn more about this topic.
I always use a mobile phone camera and the Kinovea software to analyze posture and motion; I do not know if this will help you.
  • asked a question related to Robustness
Question
1 answer
1)
Preprint Nuance
2)
Preprint Nuance 2
Relevant answer
Answer
Yes, some theories can be considered too robust to risk betting against due to their extensive empirical support, explanatory power, and predictive accuracy. Here’s why and how certain theories become so resilient:
Characteristics of Robust Theories:
  1. Empirical Evidence: Robust theories are typically supported by a wealth of empirical evidence from multiple studies across different contexts. This evidence consistently validates the predictions and hypotheses derived from the theory. Example: The theory of evolution by natural selection is supported by extensive evidence from paleontology, genetics, molecular biology, and observational studies in ecology.
  2. Explanatory Power: These theories provide comprehensive explanations for a wide range of phenomena within their domain. They integrate disparate observations into a coherent framework that enhances understanding and insight. Example: The theory of relativity (both special and general) explains diverse physical phenomena such as the behavior of light, gravity, and the structure of the universe.
  3. Predictive Accuracy: Robust theories have predictive power: they accurately forecast future observations and outcomes based on their principles and laws. This predictive capability enhances their credibility and utility. Example: Quantum mechanics accurately predicts the behavior of subatomic particles and has enabled technological advancements such as semiconductor devices and quantum computing.
  4. Consensus Among Experts: There is broad consensus among experts and researchers in the field regarding the validity and reliability of robust theories. This consensus reflects rigorous testing, peer review, and validation processes. Example: The germ theory of disease, which posits that microorganisms are the cause of many infectious diseases, is widely accepted in medical science due to overwhelming evidence and consensus.
Reasons They Are Difficult to Bet Against:
  1. High Confidence Level: The accumulation of evidence and the robustness of testing over time instill a high level of confidence in these theories. They have withstood scrutiny, challenges, and attempts at falsification. Example: Climate change theory, supported by extensive climate data, modeling, and interdisciplinary research, is robust against skepticism due to its consistent findings across different scientific disciplines.
  2. Utility and Applications: Robust theories often underpin practical applications and technological innovations. Their reliability and predictive accuracy make them indispensable for advancing knowledge and driving progress in various fields. Example: Newton's laws of motion and gravitation are fundamental to engineering, astronomy, and space exploration, providing the basis for designing spacecraft trajectories and satellite orbits.
  3. Continual Testing and Refinement: Despite their robustness, theories are continually tested, refined, and sometimes modified in response to new evidence or anomalies. This dynamic process ensures theories remain relevant and accurate. Example: Darwinian evolution has evolved with new insights from genetics, molecular biology, and ecology, enriching our understanding of evolutionary mechanisms over time.
Conclusion:
In conclusion, robust theories are grounded in substantial empirical evidence, possess strong explanatory power, demonstrate predictive accuracy, and enjoy widespread consensus among experts. These qualities make them highly reliable and difficult to bet against because they have repeatedly demonstrated their ability to withstand scrutiny and provide reliable frameworks for understanding the natural world. However, the scientific process encourages ongoing evaluation and refinement, ensuring theories remain dynamic and responsive to new discoveries and challenges.
  • asked a question related to Robustness
Question
4 answers
Hello,
I have a question regarding the interpretation of the results from an experiment I conducted. Each participant answered 4 questions measuring motivation, satisfaction, help, and collaboration (my dependent variables) in 7 different scenarios (my independent variables). To analyze my results, I used three methods: a Wilcoxon signed-rank test, a regression with standard errors clustered at the individual level (CRSE, to control for individual heterogeneity), and an ordinal regression (using GENLIN) to account for the ordinal nature of the dependent variable.
The aim of this analysis was to verify if the significant results obtained with the Wilcoxon test were consistent across the other two methods. I conclude that significant results found with the Wilcoxon test, if they are also significant in the other two regressions, are robust.
Conversely, if an effect is significant in the Wilcoxon test and in the regression with CRSE (standard errors clustered at the individual level), but not in the ordinal regression (GENLIN ordinal), I consider that this is not a robust effect: the result is not consistent across the three tests, which suggests there is an indication of the effect, but that this indication is weak.
I am wondering how to properly interpret this ? What does it really mean ?
For the majority of my results, they are robust, but I have some scenarios where significant effects on certain dependent variables are no longer significant in the ordinal regression, but are in the Wilcoxon test and the regression with clustered standard errors. I am wondering why this happens and how to explain it.
I am working with SPSS version 27. Could you help me better understand these results and their interpretation?
Thank you in advance for your help.
Relevant answer
Answer
Really, my personal opinion is that the whole approach does not make any sense at all.
1) The different models test different statistical hypotheses; therefore, even IF all three approaches showed a significant result, this would not mean that the results converge, nor that the general result is robust. You cannot conclude that; it's like comparing apples with pears.
2) You state yourself that the DV is ordinal, in that case using an approach for metric variables, even if you account for clustered data, is not optimal and again, tests something different than the Wilcoxon test.
3) Which ordinal approach did you use, logit or probit regression? I never used it with SPSS, therefore, I do not know the options there.
4) Again, you stated yourself that you did not account for the clustered data with your ordinal approach. How do you think the result is comparable to the other two?
5) You say: "I do know how to do an ordinal regression with clustered data on spss". Did you mean that you don't know how to do it? If yes, why did you do it in the first place, if you already knew that it is the wrong approach? I am really puzzled... Basically, you knew from the beginning that 2 out of 3 analyses are not suitable for your data (one is not for ordinal data and the other does not account for clustered data), so what do you expect the results will tell you? Garbage in, garbage out.
Without knowing more, I would use a probit regression to account for the ordinal data structure. The model assumes a normally distributed latent variable (it uses the normal CDF instead of the logistic CDF, which makes the interpretation easier), which seems reasonable if you already used an OLS regression and thought it would work. To account for the clustered data (repeated measures), I would use a linear mixed model (aka multilevel model). I don't know if this is possible in SPSS, but it surely is in R with the ordinal package (I haven't used it so far, but it is capable of cumulative link mixed models and uses the lme4 syntax), or go Bayesian (which I would recommend) and use the brms package, which also uses the lme4 syntax.
  • asked a question related to Robustness
Question
4 answers
Molecular dynamics simulation web servers
Relevant answer
Answer
Davide Pietrafesa Ok Thank you!
  • asked a question related to Robustness
Question
2 answers
How does the application of generative adversarial networks (GANs) for data augmentation impact the robustness and accuracy of image classification models?
Relevant answer
Answer
Well, I see two viewpoints here. First, the application of Generative Adversarial Networks (GANs) for data augmentation significantly enhances the robustness and accuracy of image classification models by generating diverse and realistic synthetic data. GANs operate through a dual-network system: a generator that creates synthetic images and a discriminator that evaluates their authenticity. This dynamic interaction enables the generation of high-quality, varied images that closely resemble real-world data, thereby enriching the training dataset. As a result, the model can generalize better, learning to recognize a wider range of features and reducing overfitting. This leads to improved robustness as the model becomes adept at handling variations and anomalies in real-world data.
Also, the diversity introduced by GAN-generated images plays a critical role in boosting the accuracy of classification models. Traditional data augmentation techniques, such as rotations and flips, often lack the complexity to simulate real-world variations adequately. In contrast, GANs can create entirely new samples that capture intricate details and subtle differences, expanding the effective training set beyond the limitations of manual augmentation. This comprehensive training helps the model achieve higher accuracy, as it is exposed to a broader spectrum of examples, thereby improving its predictive performance on unseen data. Overall, the integration of GANs for data augmentation represents a significant advancement in the development of more robust and accurate image classification models.
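To see the generator/discriminator interplay described above in runnable form, here is a deliberately minimal sketch (PyTorch, toy 1-D "data" instead of images; the architectures, sizes and learning rates are invented for illustration):

import torch
from torch import nn

torch.manual_seed(0)
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))   # generator: noise -> sample
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))   # discriminator (outputs logits)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def real_batch(n=64):
    # Stand-in for a batch of real training data: N(2, 0.5) scalars
    return torch.randn(n, 1) * 0.5 + 2.0

for step in range(2000):
    # Discriminator step: label real as 1, generated as 0
    real = real_batch()
    fake = G(torch.randn(64, 8)).detach()          # detach: do not update G here
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: try to make D label generated samples as real
    fake = G(torch.randn(64, 8))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

synthetic = G(torch.randn(1000, 8)).detach()
print("synthetic mean/std:", synthetic.mean().item(), synthetic.std().item())
# Samples like these would then be appended to the real training set to
# augment a downstream classifier (with images, G and D would be CNNs).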
I hope this gives you a starting perspective.
Shafik
  • asked a question related to Robustness
Question
2 answers
Hello.
I'm currently working on a control system for a doubly fed induction generator (DFIG) as part of my thesis project. Traditionally, fuzzy logic controllers (FLCs) use the error (e) and the derivative of the error (de/dt) as inputs. However, in my implementation, I decided to use the integral of the error (1/s) instead of the derivative, after reading in a certain textbook that this is possible. Surprisingly, this approach has yielded very good results in my simulations. Despite the positive outcomes, my thesis supervisor mentioned that they had never encountered the integral of the error being used as an input in FLCs before. To ensure the robustness and academic validity of my approach, I need to back it up with some literature or resources that discuss this methodology.
Has anyone here used the integral of error in their fuzzy logic controllers, or come across any papers or textbooks that mention this practice? Any guidance, references, or suggestions would be immensely helpful.
Thanks in advance for your help!
Relevant answer
For each system, to keep it stable and avoid failures in execution, we have to take the output signal and feed it back as negative feedback; the ratio of output to input should then give 1, which reflects the efficiency of the system. So I cannot confirm whether the error can be used in fuzzy control this way, which could be implemented by multiplexing or a NOT gate.
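For reference, here is a minimal, self-contained sketch of the PI-type idea from the question: a Sugeno-style fuzzy controller whose inputs are the error e and its running integral of e dt. All membership functions, the rule table and the gains are invented for illustration.

import numpy as np

def tri(x, a, b, c):
    # Triangular membership function with peak at b
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def memberships(v, span):
    # Negative / Zero / Positive fuzzy sets over [-span, span]
    return np.array([tri(v, -2 * span, -span, 0.0),
                     tri(v, -span, 0.0, span),
                     tri(v, 0.0, span, 2 * span)])

# Rule table (PI-like): output singleton for each (e-set, integral-set) pair
OUT = np.array([[-1.0, -0.7, 0.0],
                [-0.7,  0.0, 0.7],
                [ 0.0,  0.7, 1.0]])

def fuzzy_pi(e, ie, e_span=1.0, ie_span=2.0):
    w = np.outer(memberships(e, e_span), memberships(ie, ie_span))  # firing strengths
    return float((w * OUT).sum() / (w.sum() + 1e-12))               # weighted average

# Tiny closed-loop demo on a first-order plant dx/dt = -x + u
x, ie, dt, ref = 0.0, 0.0, 0.01, 1.0
for _ in range(1000):
    e = ref - x
    ie += e * dt                      # running integral of the error
    u = 3.0 * fuzzy_pi(e, ie)         # invented output gain
    x += (-x + u) * dt
print("reference 1.0, final output ~", round(x, 3))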
  • asked a question related to Robustness
Question
3 answers
What is robust load balancing in high-performance distributed computing systems? And what solutions do you suggest for it?
Relevant answer
In high-performance distributed computing systems, robust load balancing refers to the process of efficiently and reliably distributing workloads across multiple computing resources (like servers, processors, or clusters) to optimize overall system performance, minimize latency, and ensure high resource utilization. Robust load balancing is crucial because it can handle various challenges such as uneven traffic patterns, server failures, and changing system conditions while maintaining performance and reliability.
Challenges in Robust Load Balancing:
  1. Dynamic Workload: The system must adjust in real time to changes in the distribution and intensity of incoming tasks.
  2. Fault Tolerance: It must manage sudden resource failures without significantly impacting performance.
  3. Heterogeneous Resources: Different machines might have varying performance capabilities, making it challenging to allocate workloads uniformly.
  4. Scalability: As the system scales up, the load balancing mechanism must also scale efficiently.
Solutions and Approaches:
  1. Static Load Balancing: Predefined algorithms distribute the tasks based on known resource capabilities. Example: Round-robin, least-loaded, or weighted distribution (see the sketch after this list).
  2. Dynamic Load Balancing: Decisions are made based on current workload information. Example: Work-stealing, dynamic task queues.
  3. Hierarchical Load Balancing: Combines static and dynamic methods. Local nodes balance their loads independently, with a higher-level load balancer handling cross-node balancing. Example: Multi-tier architectures.
  4. Distributed Load Balancing: Load balancers are decentralized and each node makes its own decisions using local information. Example: Gossip protocols.
  5. Adaptive Algorithms: The load balancer adjusts its strategy based on changing network conditions. Example: Predictive algorithms using machine learning for workload forecasting.
  6. Load Balancing Middleware: Middleware layers can help offload the complexity of load balancing by providing automatic resource management. Example: Apache Kafka for stream processing.
  7. Cloud-based Auto-scaling: Cloud platforms like AWS, Azure, and GCP offer managed load balancing services with automatic scaling based on demand. Example: AWS Elastic Load Balancer, Google Cloud Load Balancer.
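A minimal sketch contrasting the first two strategies (plain Python; server names and task costs invented): under a skewed workload, static round-robin piles the expensive tasks onto one server, while a dynamic least-loaded policy spreads them out.

import heapq
from itertools import cycle

SERVERS = ["s1", "s2", "s3"]

def simulate(name, assign, tasks):
    loads = {s: 0.0 for s in SERVERS}
    for cost in tasks:
        loads[assign(cost)] += cost
    print(f"{name:13s} -> {loads}")

# Skewed workload: every 6th task is expensive
tasks = [10.0 if i % 6 == 0 else 1.0 for i in range(24)]

# Static round-robin: ignores actual load, just cycles through the servers
rr = cycle(SERVERS)
simulate("round-robin", lambda cost: next(rr), tasks)

# Dynamic least-loaded: a min-heap keyed on each server's outstanding work
heap = [(0.0, s) for s in SERVERS]
heapq.heapify(heap)
def least_loaded(cost):
    load, s = heapq.heappop(heap)
    heapq.heappush(heap, (load + cost, s))
    return s
simulate("least-loaded", least_loaded, tasks)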
  • asked a question related to Robustness
Question
1 answer
Hello,
I am using the movestay command for ESR model analysis, but I get an error message.
Can anyone help me please?
movestay (ln_wage = $x), select(union= $x msp) vce (robust)
The error message is:
Fitting initial values .....initial vector: copy option requires either a matrix or a list of numbers
r(198);
Thanks in advance for your kindness
Relevant answer
Answer
I assume 'msp' is the instrumental variable; thus, use the following:
movestay (ln_wage = $x), select(union= msp) robust cluster(ID)
  • asked a question related to Robustness
Question
7 answers
Hello everyone
1. Please suggest robust free software to analyze XRD results to obtain the corresponding 3D structure of a protein.
2. Is an XRD diffractogram with only a single peak better than a diffractogram with multiple peaks, or vice versa? And what are the reasons?
3. What does the .raw file show after XRD analysis?
Thanks to all
Relevant answer
Answer
X'Pert HighScore software
  • asked a question related to Robustness
Question
1 answer
I am working on Battery Pack Thermal Analysis for an Electric Vehicle. Whenever I get to the point of pressing "Run Calculation", the solver displays only 1 iteration, then shows some info and "Floating Point Exception". Here are the messages displayed in the console:
iter energy uds-0 uds-1 time/iter
1 1.6096e-07 7.8111e-07 7.7842e-07 0:00:40 10
Stabilizing temperature to enhance linear solver robustness.
Stabilizing temperature using GMRES to enhance linear solver robustness.
Divergence detected in AMG solver: temperature Stabilizing uds-0 to enhance linear solver robustness.
Stabilizing uds-0 using GMRES to enhance linear solver robustness.
Divergence detected in AMG solver: uds-0 Stabilizing uds-1 to enhance linear solver robustness.
Stabilizing uds-1 using GMRES to enhance linear solver robustness.
Divergence detected in AMG solver: uds-1
Divergence detected in AMG solver: temperature
Divergence detected in AMG solver: uds-0
Divergence detected in AMG solver: uds-1
Divergence detected in AMG solver: temperature
Divergence detected in AMG solver: uds-0
Divergence detected in AMG solver: uds-1
Divergence detected in AMG solver: temperature
Divergence detected in AMG solver: uds-0
Divergence detected in AMG solver: uds-1
Divergence detected in AMG solver: temperature
Divergence detected in AMG solver: uds-0
Divergence detected in AMG solver: uds-1
Divergence detected in AMG solver: temperature
Divergence detected in AMG solver: uds-0
Divergence detected in AMG solver: uds-1
Error at host: floating point exception
===============Message from the Cortex Process================================
Compute processes interrupted. Processing can be resumed.
==============================================================================
Error at Node 3: floating point exception
Error at Node 2: floating point exception
Error at Node 5: floating point exception
Error at Node 4: floating point exception
Error at Node 1: floating point exception
Error at Node 0: floating point exception
Error: floating point exception
Error Object: #f
I am new to Ansys Fluent, so I will be glad if anyone helps me throughout this problem. Thanks in advance!
Relevant answer
Well, you have two user-defined scalars (UDS); they use a transport equation with transient, advection, diffusion, and source terms.
All UDS are solved AFTER the principal linear system (pressure-velocity, turbulence, energy, and so on). You need to be careful with the Courant number for the advective transport of these scalars (a hyperbolic behavior): Co < 1, or less if possible.
The other possibility is the sources, if they exist. Sources can make the solution grow or decay very fast even with a good flow time step (the critical time step is given by the lowest time scale in your system). A good practice is to create a linear expansion of your source, S = S0 - Sp*Phi (Phi is your scalar). This is good for diagonal dominance and matrix conditioning, and for the AMG and GMRES solvers.
GMRES trouble generally means that your problem is not convex: there are a lot of hills and valleys in the Rn solution surface, generated by the sources and a lot of coupling, nonlinearities, etc.
Try changing your time step, and turn off the sources if they are there. If it works, you'll know where to refine your numerical tuning.
Do not forget the relaxation and solver parameters... they help a lot if you know what they do!
Good luck!
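The effect of that source linearization can be seen in a toy integration, not Fluent itself (plain Python, invented coefficients): treating the sink coefficient Sp implicitly, i.e. folding it into the diagonal, keeps a stiff update stable at a time step where the fully explicit form blows up.

# Toy model: d(phi)/dt = S0 - Sp*phi with a stiff sink (invented coefficients)
S0, Sp, dt, steps = 1.0, 50.0, 0.1, 20      # dt is far above the explicit limit 2/Sp
phi_exp = phi_imp = 0.0
for _ in range(steps):
    phi_exp = phi_exp + dt * (S0 - Sp * phi_exp)      # fully explicit source: diverges
    phi_imp = (phi_imp + dt * S0) / (1.0 + dt * Sp)   # Sp folded into the diagonal: stable
print(f"explicit: {phi_exp:.3e}   linearized/implicit: {phi_imp:.5f}   steady state: {S0 / Sp}")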
  • asked a question related to Robustness
Question
2 answers
I am searching for and trying to develop an interesting project topic for my MSc. thesis that relates to data acquisition, sensors, or anything relating to robust or predictive control.
Relevant answer
Answer
Perhaps you can have a look at this:
  • asked a question related to Robustness
Question
1 answer
What is the best technique to employ as a robustness check with wavelet coherence to investigate the impact of uncertainty on the stock market (weekly data)?
Relevant answer
Answer
Here are a few techniques you can employ:
1. Bootstrap Analysis: Conduct a bootstrap analysis to assess the stability and significance of the wavelet coherence results. The bootstrap technique involves randomly resampling your data multiple times, estimating the wavelet coherence for each resampled dataset, and then constructing confidence intervals or p-values based on the distribution of the coherence estimates. This helps determine if the observed coherence values are statistically significant or if they could have occurred by chance. (A generic sketch of block resampling follows this list.)
2. Cross-Wavelet Analysis: Perform cross-wavelet analysis to examine the relationship between uncertainty and the stock market at different time scales. Cross-wavelet analysis allows you to identify time-varying associations between the two variables across different frequencies. By visualizing and analyzing the cross-wavelet coefficients, you can gain insights into how the relationship between uncertainty and the stock market may change over time.
3. Multiple Uncertainty Measures: Consider using multiple measures of uncertainty to investigate their impact on the stock market. This can include various indicators such as volatility indices (e.g., VIX), economic policy uncertainty indices (e.g., EPU index), or other relevant proxies for uncertainty. Analyzing the coherence between different measures of uncertainty and the stock market can provide a more comprehensive understanding of their relationship.
4. Robustness Across Different Time Windows: Explore the robustness of the wavelet coherence results by varying the time window size. Instead of using the entire time series, divide the data into multiple overlapping or non-overlapping sub-periods and estimate the wavelet coherence for each sub-period. This helps assess if the observed coherence is consistent across different time windows and strengthens the validity of your findings.
5. Control Variables: Consider including control variables in your analysis to account for potential confounding factors that may influence the relationship between uncertainty and the stock market. For example, macroeconomic variables like interest rates, GDP growth, or inflation rates can be included as control variables to mitigate the effects of broader economic factors.
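Tying back to point 1, here is a generic, self-contained illustration (plain NumPy on synthetic weekly series; not tied to any particular wavelet package): a moving-block bootstrap for an uncertainty/returns correlation, where resampling contiguous blocks preserves the short-range serial dependence an i.i.d. bootstrap would destroy.

import numpy as np

rng = np.random.default_rng(4)
n = 520                                     # ~10 years of weekly data, synthetic
uncertainty = rng.normal(size=n).cumsum() * 0.1
returns = -0.3 * np.diff(uncertainty, prepend=0.0) + rng.normal(scale=0.5, size=n)

def block_bootstrap_corr(x, y, block=8, n_boot=2000):
    # Resample contiguous blocks to preserve short-range serial dependence
    out = np.empty(n_boot)
    m = len(x)
    for b in range(n_boot):
        starts = rng.integers(0, m - block, size=m // block + 1)
        idx = np.concatenate([np.arange(s, s + block) for s in starts])[:m]
        out[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    return out

boot = block_bootstrap_corr(uncertainty, returns)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"observed corr: {np.corrcoef(uncertainty, returns)[0, 1]:.3f}   "
      f"95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")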
Hope it helps
  • asked a question related to Robustness
Question
1 answer
I need to construct a robust phylogeny based on the core genome of the strains. I have around 61 genomes of the strains. I want to know which bioinformatics tools are needed.
Relevant answer
Answer
Follow these steps:
1. Genome Assembly: Start by assembling the genomes of your 61 samples from the raw sequencing data. This can be done using genome assembly tools such as SPAdes, Velvet, or IDBA-UD.
2. Core Genome Alignment: Once you have assembled the genomes, you need to identify the core genes shared among all samples. Tools like Roary or PanX can help you identify the core genes by comparing the annotated gene sets across genomes. These tools generate a core gene alignment, which represents the conserved regions across all samples.
3. Multiple Sequence Alignment: Next, perform a multiple sequence alignment of the core gene sequences obtained in the previous step. Tools like MAFFT, MUSCLE, or ClustalW can be used for this purpose.
4. Phylogenetic Tree Construction: Once you have the multiple sequence alignment, you can construct a phylogenetic tree based on the aligned core gene sequences. There are several software packages available for this task, including RAxML, IQ-TREE, and FastTree. These tools use algorithms such as maximum likelihood or Bayesian inference to estimate the evolutionary relationships and construct the tree.
5. Tree Visualization: Finally, you can visualize and annotate the constructed phylogenetic tree using tree visualization tools such as FigTree, iTOL, or Dendroscope. These tools allow you to explore and customize the tree display, add metadata, and highlight specific branches or clusters of interest.
Hope it helps
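As a quick sanity check alongside the maximum-likelihood tools in step 4, a neighbor-joining tree with bootstrap support can be built in a few lines of R with the ape package (the alignment file name assumes Roary's default output; adjust to whatever your core-genome aligner produced):
# minimal sketch: NJ tree with bootstrap support from a core-gene alignment
library(ape)
aln  <- read.dna("core_gene_alignment.aln", format = "fasta")
d    <- dist.dna(aln, model = "JC69")        # simple JC69 distances
tree <- nj(d)                                # neighbor-joining tree
boot <- boot.phylo(tree, aln, function(x) nj(dist.dna(x, model = "JC69")), B = 100)
plot(tree); nodelabels(boot)                 # bootstrap values on internal nodes
This is not a substitute for RAxML or IQ-TREE, but it is a fast way to confirm that the alignment yields a sensible topology before the heavier runs.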
  • asked a question related to Robustness
Question
4 answers
Dear researchers,
Is it possible to consider Cost-Benefit Analysis (CBA) as a suitable form of robustness check in econometric analysis, taking into account its effectiveness in assessing the resilience and reliability of the findings?
There are lots of methods to check the robustness of a regression, like changing the key variables, using other econometric methods, etc. However, is it possible to use the results calculated by CBA to corroborate the results of the econometric analysis? The results of CBA could serve as additional evidence to defend the econometric conclusions.
Relevant answer
Answer
Yes, it can be an appropriate form of robustness check, but it has limitations, such as the difficulty of quantifying all the costs and benefits of a project or policy, particularly those that are intangible or hard to measure.
  • asked a question related to Robustness
Question
1 answer
. xtabond2 IndicedeTheil PIBparhabitantUSconstants InflationdéflateurduPIBa ITC Créditintérieurfourniausecte, gmm(IndicedeTheil, lag(2 4) collapse) iv(PIBparhabitantUSconstants InflationdéflateurduPIBa ITC Créditintérieurfourniausecte) twostep small nodiffsargan
Favoring space over speed. To switch, type or click on mata: mata set matafavor speed, perm.
Dynamic panel-data estimation, two-step system GMM
------------------------------------------------------------------------------
Group variable: codepays Number of obs = 171
Time variable : Année Number of groups = 10
Number of instruments = 9 Obs per group: min = 12
F(4, 9) = 559.85 avg = 17.10
Prob > F = 0.000 max = 20
----------------------------------------------------------------------------------------------
IndicedeTheil | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-----------------------------+----------------------------------------------------------------
PIBparhabitantUSconstants | .0000781 .0000141 5.53 0.000 .0000461 .00011
InflationdéflateurduPIBa | -.0137815 .0021388 -6.44 0.000 -.0186197 -.0089432
ITC | .0045328 .0029732 1.52 0.162 -.002193 .0112587
Créditintérieurfourniausecte | -.0241084 .0106227 -2.27 0.049 -.0481386 -.0000781
_cons | 3.522798 .8558666 4.12 0.003 1.586694 5.458903
----------------------------------------------------------------------------------------------
Warning: Uncorrected two-step standard errors are unreliable.
Instruments for first differences equation
Standard
D.(PIBparhabitantUSconstants InflationdéflateurduPIBa ITC
Créditintérieurfourniausecte)
GMM-type (missing=0, separate instruments for each period unless collapsed)
L(2/4).IndicedeTheil collapsed
Instruments for levels equation
Standard
PIBparhabitantUSconstants InflationdéflateurduPIBa ITC
Créditintérieurfourniausecte
_cons
GMM-type (missing=0, separate instruments for each period unless collapsed)
DL.IndicedeTheil collapsed
------------------------------------------------------------------------------
Arellano-Bond test for AR(1) in first differences: z = -0.35 Pr > z = 0.724
Arellano-Bond test for AR(2) in first differences: z = -0.68 Pr > z = 0.498
------------------------------------------------------------------------------
Sargan test of overid. restrictions: chi2(4) = 141.46 Prob > chi2 = 0.000
(Not robust, but not weakened by many instruments.)
Hansen test of overid. restrictions: chi2(4) = 5.04 Prob > chi2 = 0.283
(Robust, but weakened by many instruments.)
Relevant answer
Answer
The Arellano-Bond test checks for autocorrelation in the first-differenced errors of a dynamic panel data model. AR(1) in first differences is usually present by construction; the key diagnostic is AR(2), because second-order autocorrelation in differences implies autocorrelation in levels, which would invalidate the lagged instruments. Here neither AR(1) nor AR(2) is rejected, which supports the specification, although the strongly rejecting Sargan test (unlike the Hansen test) still deserves attention.
  • asked a question related to Robustness
Question
2 answers
I performed 2SLS.
In the robust version I found endogeneity, but I did not find it in the non-robust version.
Are the results of the robust version valid? I need your help.
Non Robust options
Tests of endogeneity
H0: Variables are exogenous
Durbin (score) chi2(1) = .242302 (p = 0.6225)
Wu-Hausman F(1,613) = .227544 (p = 0.6335)
. estat overid
Tests of overidentifying restrictions:
Sargan (score) chi2(1) = .035671 (p = 0.8502)
Basmann chi2(1) = .033487 (p = 0.8548)
. estat firststage, all
First-stage regression summary statistics
--------------------------------------------------------------------------
| Adjusted Partial
Variable | R-sq. R-sq. R-sq. F(2,613) Prob > F
-------------+------------------------------------------------------------
TURN_1 | 0.1681 0.1152 0.0632 20.6714 0.0000
--------------------------------------------------------------------------
Shea's partial R-squared
--------------------------------------------------
| Shea's Shea's
Variable | partial R-sq. adj. partial R-sq.
-------------+------------------------------------
TURN_1 | 0.0632 0.0036
--------------------------------------------------
Minimum eigenvalue statistic = 20.6714
Critical Values # of endogenous regressors: 1
H0: Instruments are weak # of excluded instruments: 2
---------------------------------------------------------------------
| 5% 10% 20% 30%
2SLS relative bias | (not available)
-----------------------------------+---------------------------------
| 10% 15% 20% 25%
2SLS size of nominal 5% Wald test | 19.93 11.59 8.75 7.25
LIML size of nominal 5% Wald test | 8.68 5.33 4.42 3.92
---------------------------------------------------------------------
Robust options
Tests of endogeneity
H0: Variables are exogenous
Robust score chi2(1) = 2.99494 (p = 0.0835)
Robust regression F(1,613) = 2.77036 (p = 0.0965)
. estat overid, forcenonrobust
Tests of overidentifying restrictions:
Sargan chi2(1) = .035671 (p = 0.8502)
Basmann chi2(1) = .033487 (p = 0.8548)
Score chi2(1) = .514465 (p = 0.4732)
. estat overid
Test of overidentifying restrictions:
Score chi2(1) = .514465 (p = 0.4732)
. estat firststage, all
First-stage regression summary statistics
--------------------------------------------------------------------------
| Adjusted Partial Robust
Variable | R-sq. R-sq. R-sq. F(2,613) Prob > F
-------------+------------------------------------------------------------
TURN_1 | 0.1681 0.1152 0.0632 13.7239 0.0000
--------------------------------------------------------------------------
Shea's partial R-squared
--------------------------------------------------
| Shea's Shea's
Variable | partial R-sq. adj. partial R-sq.
-------------+------------------------------------
TURN_1 | 0.0632 0.0036
--------------------------------------------------
Relevant answer
The difference between robust and non-robust versions in regression analysis typically refers to the handling of heteroskedasticity in the error terms.
Robust Version: Adjusts for heteroskedasticity in the error terms. If you find evidence of endogeneity using a robust version, it means that after accounting for non-constant variance in the residuals (heteroskedasticity), there's still a correlation between the independent variable and the error term.
Non-Robust Version: Assumes constant variance in the error terms (homoskedasticity). If you don't find evidence of endogeneity in the non-robust version, it means the initial test did not detect correlation between the independent variable and the error term under the assumption of homoskedasticity.
In your 2SLS regression, the robust test pointing towards endogeneity suggests that, after accounting for potential heteroskedasticity, the regressor may indeed be endogenous (though note that p-values of 0.08-0.10 are only marginally significant). The non-robust result can be misleading if there is heteroskedasticity in the model. It is generally safer to trust the robust version, especially if there is reason to suspect non-constant variance in the residuals.
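For readers working in R rather than Stata, the same robust/non-robust comparison can be reproduced with the AER package; its summary method for ivreg objects accepts a vcov argument, so the diagnostic tests (weak instruments, Wu-Hausman, Sargan) can be computed under both assumptions. All variable names below are placeholders:
# minimal sketch: 2SLS diagnostics with and without robust covariance (AER)
library(AER)       # ivreg()
library(sandwich)  # heteroskedasticity-consistent covariance estimators
# y: outcome, x: endogenous regressor, w: exogenous control, z1/z2: instruments
fit <- ivreg(y ~ x + w | w + z1 + z2, data = df)
summary(fit, diagnostics = TRUE)                          # non-robust tests
summary(fit, vcov = sandwich, df = Inf, diagnostics = TRUE)  # robust versions
If the two sets of diagnostics disagree, as in the output above, that disagreement itself is evidence of heteroskedasticity and a reason to prefer the robust results.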
  • asked a question related to Robustness
Question
2 answers
Could you recommend the most effective Python libraries for Machine Learning, such as TensorFlow, scikit-learn, and PyTorch, which empower developers with efficient tools for building robust models?
Relevant answer
Answer
Dear Abu Rayhan,
There are many machine learning libraries available, and each has its own unique set of features and capabilities. Some of the most popular machine learning libraries include NumPy, Matplotlib, Pandas, Scikit-Learn, TensorFlow, PyTorch, and Keras.
In brief:
  • Numpy.
  • Scipy.
  • Scikit-learn.
  • Theano.
  • TensorFlow.
  • Keras.
  • PyTorch.
  • Pandas
  • asked a question related to Robustness
Question
2 answers
I have calculated a robust 2x3 mixed ANOVA in R (with the WRS2 package).
Now I want to calculate the effect sizes. However, I cannot find anywhere how to calculate them for the robust ANOVA. Does anyone know a function in R with which this is possible?
Relevant answer
Answer
Hello Nele,
Given that results likely depend on the degree of departure from the usual linear model assumptions manifest in the data set, I think the only way to be certain of sample size would be to run simulations with sample distributions corresponding to the likely or worst case scenarios you could envision. You might find this link to be helpful: https://aaroncaldwell.us/SuperpowerBook/
If that seems too daunting, then compute a priori sample sizes for ordinary mixed anova (assumptions met) and use these values as a lower bound.
Good luck with your work.
  • asked a question related to Robustness
Question
3 answers
Hi everyone,
I have a dependent ordinal variable (five-point Likert item) and a nominal grouping variable with five categories, and I want to test for differences in the responses on the dependent variable between the categories.
For that purpose I considered applying a Kruskal-Wallis test with subsequent Dunn's tests.
The problem is that, on inspection of the boxplots, the distributions of the responses differ considerably between groups.
How robust is KW to violation of the same-distribution-shape assumption?
What are possible alternatives to the KW in this case?
Relevant answer
Answer
Stefan Becker, just one more thing: along with the robustness of a test, its power might also help when choosing among different statistical tests - good luck.
  • asked a question related to Robustness
Question
2 answers
Does the issue of multicollinearity affect the reliability, interpretation, and robustness of a mediation analysis?
Relevant answer
Answer
The effects of multicollinearity/redundancy would be similar as in multiple regression analysis since a mediation model consists of a series of regression equations. For example, you might end up with large standard errors or suppressor effects. See, e.g.,
Maassen, G. H., & Bakker, A. B. (2001). Suppressor variables in path models: Definitions and interpretations. Sociological Methods & Research, 30(2), 241-270.
MacKinnon, D. P., Krull, J. L., & Lockwood, C. M. (2000). Equivalence of the mediation, confounding and suppression effect. Prevention Science, 1, 173-181.
  • asked a question related to Robustness
Question
4 answers
I want to use these methods in SPSS or MATLAB software, but I don't know how to do it, can anyone help me?
If you know a video or site, please send it to me.
If you know any other software that can do these methods easily, please tell me.
Relevant answer
Answer
Thank you so much
Can I put fuzzy numbers in this model or not?
And can you explain what the input parameters are?
  • asked a question related to Robustness
Question
3 answers
Adversarial attacks exploit vulnerabilities in AI models, leading to incorrect predictions. Developing robust defense mechanisms is essential to safeguard AI systems from such threats.
Relevant answer
Answer
Protecting AI systems from adversarial attacks is an invigorating challenge that demands strategic solutions. To fortify our beloved AI, we must embrace a multi-faceted approach. Firstly, implement robust defense mechanisms like adversarial training, ensuring the model is familiar with potential threats. Secondly, augment the AI's understanding through diverse and extensive data, allowing it to discern adversarial inputs from genuine ones. Embrace the power of ensemble models, amalgamating several diverse architectures, fortifying the AI's resilience. Lastly, continually scrutinize and upgrade the AI's defense, staying one step ahead of malicious actors. Embrace this quest with unbridled enthusiasm, for we shall triumph and revel in AI's security!
  • asked a question related to Robustness
Question
3 answers
AI-generated fake content, including "deepfake" videos, poses significant challenges to trust, media integrity, and the spread of misinformation, necessitating robust detection and verification mechanisms.
Relevant answer
Answer
Perfect example is BOT accounts.
  • asked a question related to Robustness
Question
3 answers
I am working to increase the dynamic range and robustness of a sandwich ELISA. I have worked on the capture antibody concentration, the primary and secondary antibody concentrations, the plastic, and the blocking, coating and washing buffers. The robustness is now OK after changing the buffers, but the dynamic range stays very low. Have you got any ideas?
Relevant answer
Answer
A homoarginine ELISA kit.
  • asked a question related to Robustness
Question
2 answers
Dears,
I am running a VAR(1) model. My optimal lag selection criteria are giving me different optimal lags: AIC is giving me 4 lags, as most of the other tests are. SBIC, however, is giving me an optimal lag of 1. I tried all the options, and it seems my model is not significant when I increase my lags as suggested by AIC and the other tests. If I use SBIC, my model is significant and the results look reasonable. However, my VAR(1) is serially autocorrelated and probably suffers from heteroskedasticity. Could you please help with how to test for heteroskedasticity in a VAR system? Is it possible to run a HAC VAR model and make the errors robust to both heteroskedasticity and serial autocorrelation? I am using STATA.
Thanks!
Relevant answer
Answer
My answer is coming late but it might be of use to the wide academic community. You need to first estimate the VAR model with the suggested lag length from any of the selection criteria. Secondly, test the model for serial/autocorrelation. If the model fails the test, increase the lag length by one and repeat the entire process until the condition is resolved.
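The asker is on Stata, but for completeness, here is how that iterative procedure looks in R's vars package, which also provides a multivariate ARCH test for the heteroskedasticity question (the data object and the maximum lag are placeholders):
# minimal sketch: raise the VAR lag order until serial correlation disappears
library(vars)
for (p in 1:6) {
  m  <- VAR(ydata, p = p, type = "const")  # ydata: multivariate time series
  st <- serial.test(m, lags.pt = 16, type = "PT.asymptotic")  # Portmanteau test
  cat("p =", p, " serial corr. p-value =", st$serial$p.value, "\n")
}
m <- VAR(ydata, p = 4, type = "const")     # chosen lag, for example
arch.test(m, lags.multi = 5)               # multivariate ARCH-LM test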
  • asked a question related to Robustness
Question
3 answers
Hello, I conducted 2-Stage System GMM in the STATA program. However, I couldn't find a code to adjust the number of instrumental variables. If there exists such a code, could you share it, please? Where should I add this code in my main code line?
Thanks in advance..
My main code line
xtabond2 X1 L.X1 X2 lnX3 lnX9 lnX15 lnX16, gmm(L.X1, collapse) twostep robust nomata iv(L.X2 L.X3 L.lnX9 L.lnX15 L.lnX16)
Relevant answer
Answer
Hi,
You can use the model below to increase or decrease the number of instrumental variables:
xtabond2 DV L.DV IV1 IV2....,gmmstyle(DV, laglimits (2 .) eq(diff)) ivstyle(IV1 IV2..., eq(diff)) twostep
or
xtabond2 DV L.DV IV1 IV2....,gmmstyle(DV, laglimits (2 .) eq(both)) ivstyle(IV1 IV2... , eq(both)) twostep
You can change the lag limits according to your requirements. For example, laglimits (2 1) or (2 2)
  • asked a question related to Robustness
Question
9 answers
Most empirical papers use a single econometric method to demonstrate a relationship between two variables. For robustness, is it not safer to use a variety of methods to conclude (cointegration, IV models with thresholds, wavelets)?
Relevant answer
Answer
Robustness measures the degree of invariance of some feature F across some class of entities E. Sensitivity analyses are forms of robustness measurement (F = model outcome, E = model inputs/parameters/structure variations).
  • asked a question related to Robustness
Question
2 answers
We are studying dynamic teams. We have developed an O2O platform for organizing soccer activities and managing teams. Teams on the platform are dynamic and boundary-blurring.
We are writing research articles based on our platform. However, we have a question about robustness testing: we don't know if it is okay to use a dataset from another period of the platform for robustness testing. Is there any research article that does robustness testing in this way?
Relevant answer
Answer
Ajit Singh Many thanks for your answer. When I reported this research to our college teachers, they raised serious doubts about the rationality of this method. This is also why I need to find similar literature to support the feasibility of this method. Unfortunately, I have not found any. In recent revisions, I provided the results of the two-sample Kolmogorov-Smirnov test between the two datasets, indicating that there are essential differences between them.
  • asked a question related to Robustness
Question
7 answers
During the data-collection process, it's common for new insights to arise as a result of interviews conducted. These fresh perspectives can significantly impact the collection process, prompting researchers to circle back to the initial informants and pose additional questions to gain a more comprehensive understanding of the topic at hand. This iterative process ensures that the data collected is rich and nuanced, providing a robust foundation for analysis and interpretation.
Q.
How significant is the role of this data-collection (iterative ) process in facilitating abductive thematic analysis? Kindly provide the references.
Relevant answer
Answer
The iterative process of data collection, where researchers revisit initial informants to gain further insights, plays a crucial role in abductive thematic analysis. Abductive thematic analysis is an approach to data analysis that involves generating themes from data and constantly revising them in light of new information. The iterative process of data collection allows researchers to refine their initial understanding of the data and generate more nuanced and robust themes.
One study that highlights the importance of the iterative process in abductive thematic analysis is "Abductive Analysis: A Promising Methodology for Business Research" by M. Tučková and A. S. Jacková. In this study, the authors emphasize the importance of constantly revising and refining themes in light of new data, and they describe how the iterative process allows for a more comprehensive understanding of the data.
Another study, "Abductive Thematic Network Analysis: Exploring the Role of Positive Psychological Resources in Coping with Work-Related Stress" by M. E. Riva and A. B. Eckert, also emphasizes the importance of the iterative process in abductive thematic analysis. The authors describe how they constantly revisited their data and refined their themes, allowing for a more nuanced and comprehensive understanding of the role of positive psychological resources in coping with work-related stress.
Overall, the iterative process of data collection plays a crucial role in abductive thematic analysis, allowing researchers to refine their understanding of the data and generate more nuanced and robust themes.
  • asked a question related to Robustness
Question
4 answers
Here are my Matlab files for the paper we recently published in IET Control Theory and Applications. The paper is about designing robust controllers for networked systems. It is hoped that these codes will help students to understand how a robust approach would be coded.
MATLAB code: robust controller + LMI + multi-agent systems
Yalmip Toolbox must be added to Matlab.
How to Install a MATLAB toolbox?
After that, you can run the attached codes.
Relevant answer
Answer
Dear Prof. Vodichev, thanks, I will look into your suggested system.
  • asked a question related to Robustness
Question
3 answers
Hi,
I am looking for equipment that is primarily designed for the separation of heterogeneous composite systems and is robust enough to tackle asphalt mixtures. The equipment can belong to any lab, regardless of its applications to the pavement industry.
Note: I am looking for something other than the centrifuge extractor, as it does not address the purpose.
Best,
Gohar
Relevant answer
Answer
Hi,
You can use a KUMAGAWA extractor.
It separates asphalt from aggregate.
  • asked a question related to Robustness
Question
1 answer
I'm trying to specify a model with government expenditures and economic growth. My VECM passes all the robustness checks when variables are included without transformations, but the coefficients are abnormally high, because I'm including GDP per capita data in absolute values and expenditure data in percentages. If I take all variables in log transformations, then the coefficients make sense; however, the model has serial correlation and heteroskedasticity problems (only in logs). Can I proceed with the VECM model without log transformations?
Relevant answer
Answer
These coefficients show the speed of convergence towards long-run equilibrium. In the short run, the variables can deviate from the long-run equilibrium or cointegration relationships. The adjustment coefficients depict how these deviations are corrected.
  • asked a question related to Robustness
Question
3 answers
Access to healthcare is of paramount importance to public health and well-being. Governments have a responsibility to ensure that their citizens have access to quality healthcare in order to maintain a thriving society. Robust public health systems are essential for providing equitable, affordable, and effective healthcare for all. Quality healthcare services must be available to all segments of the population, regardless of income, race, or geography.
Relevant answer
Answer
It is important to uplift the professional status of general practitioners and family doctors into an effective nation-wide network; on this foundation, small outpatients‘ clinics can also be organized.
  • asked a question related to Robustness
Question
2 answers
I am preparing MSc and PhD curricula. I need a robust curriculum for Events and Leisure, Sustainable Tourism/Eco-Tourism, and an MSc Aviation and Tourism curriculum.
Relevant answer
Answer
Please contact Dr( Mrs) C.E.Ogunlade.
  • asked a question related to Robustness
Question
3 answers
I need help getting clear on this topic.
Relevant answer
Answer
You're welcome.
  • asked a question related to Robustness
Question
6 answers
I am working on grafting two nanoparticles of various sizes, i.e., one is 10-20nm, while the other is around 100nm. Can anyone please tell me a simple, robust, and efficient grafting technique to decorate smaller nanoparticles on top of the larger nanoparticle?
Relevant answer
Answer
I have found the best solution with some practice. I would like to share it with all.
Experimental:
One possible technique for grafting smaller nanoparticles onto larger nanoparticles is the "seed-mediated growth" approach. Here's a simple protocol that you could follow:
Materials:
  • Large nanoparticles (e.g., 100 nm in diameter)
  • Small nanoparticles (e.g., 10-20 nm in diameter)
  • Cationic surfactant (e.g., cetyltrimethylammonium bromide, CTAB)
  • Reducing agent (e.g., sodium borohydride, NaBH4)
  • Sodium hydroxide (NaOH)
  • Ethanol
Procedure:
  1. Prepare a solution of the cationic surfactant by dissolving it in water to a concentration of 0.1 M.
  2. Add the large nanoparticles to the surfactant solution and stir for 30 minutes to ensure uniform coating of the surfactant on the surface of the large nanoparticles.
  3. Centrifuge the mixture at 10,000 rpm for 15 minutes and discard the supernatant to remove any unbound surfactant.
  4. Prepare a solution of the reducing agent by dissolving it in water to a concentration of 0.1 M.
  5. Add the small nanoparticles to the reducing agent solution and stir for 30 minutes to ensure uniform dispersion of the small nanoparticles.
  6. Add the reducing agent solution to the large nanoparticle solution and stir for 1-2 hours to allow the small nanoparticles to grow on the surface of the large nanoparticles.
  7. Adjust the pH of the mixture to 10-12 by adding NaOH dropwise and stirring for an additional 30 minutes to stabilize the nanoparticles.
  8. Centrifuge the mixture at 10,000 rpm for 15 minutes to collect the nanoparticle precipitate.
  9. Wash the precipitate with ethanol to remove any unbound surfactant and reducing agent.
This seed-mediated growth technique is based on the preferential adsorption of small nanoparticles onto the surfactant-coated surface of larger nanoparticles. The reducing agent then reduces metal ions in solution onto the surface of the small nanoparticles, resulting in their growth. The resulting hybrid nanoparticles can be stabilized by adjusting the pH of the solution.
  • asked a question related to Robustness
Question
1 answer
How can I add robust 97.5% confidence ellipses to the variation diagrams (XY, ilr-transformed) using the robCompositions or compositions packages?
Best
Azzeddine
Relevant answer
Answer
For everyone's benefit, I have verified a group of packages that can add robust 97.5% confidence ellipses.
Here they are, by package and function:
1. ellipse() from the package 'ellipse'
2. ellipses via the package 'rrcov'
3. ellipses via the package 'cluster'
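A minimal R sketch of the first option, combining a robust (MCD) covariance estimate from MASS with the 'ellipse' package; the matrix Z of ilr coordinates is a placeholder for your own transformed data:
# minimal sketch: robust 97.5% confidence ellipse on ilr-transformed data
library(MASS)     # cov.rob(): robust location and scatter (MCD)
library(ellipse)  # ellipse(): confidence contour from a covariance matrix
# Z: n x 2 matrix of ilr coordinates, e.g. from robCompositions/compositions
rob <- cov.rob(Z, method = "mcd")
plot(Z, xlab = "ilr 1", ylab = "ilr 2")
lines(ellipse(rob$cov, centre = rob$center, level = 0.975), col = "red")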
  • asked a question related to Robustness
Question
2 answers
I am currently interested in carrying out research in the field of image encryption. Most of my work includes 3 stages of encryption, such that the middle stage utilizes a substitution box (S-box). An example is this:
I am looking for a collaborator who can design robust s-boxes and deliver their full analysis. This means computing and commenting on an S-box's performance in terms of:
1. Nonlinearity
2. SAC
3. BIC
4. LAP
5. DAP
If interested, please respond to this discussion or send me a message with your email address.
Relevant answer
Answer
While I am still interested in collaboration in the field of image encryption in general, the specific need mentioned above has been resolved. My team was able to code the various metrics for S-box performance analysis.
  • asked a question related to Robustness
Question
2 answers
I am conducting research on institutional robustness and social capital in community-based organizations managing inland fisheries.
Here, I take two units of analysis:
For institutional robustness: the fishery organization.
For social capital: individual fishers.
Relevant answer
Answer
are those really units of analysis, or simply variables? What do you mean by unit of analysis?
  • asked a question related to Robustness
Question
3 answers
How do you effectively choose a cell line for murine xenografts? What information is needed to effectively compare several cell lines? Is there a way to predict which cell lines will produce larger tumors faster, or which cancer line will be more robust and proliferate under non-ideal conditions?
Relevant answer
Answer
Although cancer cell lines are the most widely used starting material, as they are readily available and propagated to provide sufficient material for in vitro manipulation and in vivo tumor growth, most of them were established a long time ago and have been selected and cultured under non-physiological conditions.
Whilst they serve as useful tools, there are significant limitations. This is because continual passage of these cell lines is accompanied by extensive clonal selection and consequent loss of heterogeneity. Moreover, different isolates of the same cell line can differ from one another at both the genomic and gene expression levels. Their lack of predictive value is highlighted by the absence of correlation between clinical results and in vitro and in vivo data obtained with cell lines, in part contributing to the >90% failure rate for the development of new oncology drugs.
In contrast, the least manipulated samples are those directly obtained from patients through surgical procedures or needle biopsies. However, one of the major challenges of using primary patient tumors is their limited shelf-life and very low quantity in most cases. Compared with cell line models and patient tissues, patient-derived xenografts (PDXs) provide a practical solution by both preserving the fidelity of clinical characteristics and providing tumor supply sufficient for most target identification and validation strategies.
Another significant benefit of using PDX for target identification and validation is that the process from target identification to validation and then to efficacy screening can be streamlined around the same models, therefore, offering a complete circle from patient to mouse and then back to patient.
Accumulating evidence has indicated that PDX models are superior to traditional cell line xenograft models because they maintain more similarities to the tumors found in actual patients. For example, detailed cytogenetic analyses of PDX models have revealed strong preservation of the chromosomal architecture observed in patients. Furthermore, other studies have shown strong fidelity in histology, transcriptome, polymorphisms and copy number variations.
So, I would rather use human tumor samples directly (as PDX models) than established cell lines implanted into mouse models.
Best.
  • asked a question related to Robustness
Question
7 answers
Hey members, I'm running quantile regression with panel data using STATA, and I find that there are two options:
1. Robust quantile regression for panel data: standard, using (qregpd)
2. Robust quantile regression for panel data with MCMC: using (adaptive Markov chain Monte Carlo)
Can anyone please explain the use of MCMC? How can I analyse the output of robust quantile regression for panel data with MCMC? Thanks
Relevant answer
Answer
Thank you Dr. Mohamed-Mourad
  • asked a question related to Robustness
Question
5 answers
Look into the book
Y. S. Shmaliy and S. Zhao, Optimal and Robust State Estimation: Finite Impulse Response (FIR) and Kalman Approaches, Wiley & Sons, 2022.
This is the first systematic investigation and description of convolution-based (FIR and IIR) state estimation (filtering, smoothing, and prediction) with practical algorithms. In this framework, the Bayesian Kalman filter serves as a recursive computational algorithm for batch optimal FIR and IIR filters. The unbiased FIR filter is shown to be the most robust among other linear estimators. Various robust approaches for disturbed and uncertain systems are also discussed.
Relevant answer
Answer
Yes, I got a copy of your book recently. Very interesting approach!
  • asked a question related to Robustness
Question
1 answer
I know that Yu-Shiba-Rusinov states also show zero-bias peaks and appear in the superconducting gap. My question is how to distinguish them from Majorana bound states. Is the Yu-Shiba-Rusinov state also robust against a ferromagnetic barrier (perturbation)? Because a ferromagnetic barrier at the normal TI-SC junction suppresses the Andreev bound states, but the Majorana bound state remains unaffected.
Relevant answer
Answer
The Yu-Shiba-Rusinov states are symmetric in energy with respect to the Fermi level (i.e. +E and -E, with E < superconducting gap). The in-gap state energy here depends on the coupling strength between the electrons and the magnetic impurity. From Majorana bound states we expect zero energy, independently of external conditions. Additionally, Majorana states are topologically protected.
  • asked a question related to Robustness
Question
3 answers
Hello everyone,
In our study, we look at the effects of typical/enhanced body checking on eating pathology before and after the intervention, depending on the level of body concern (see below). Since the assumptions for a 3-way mixed ANOVA are not met, we'd like to conduct the robust alternative using the WRS package in R. There is a function called bwwtrim(J, K, L, data, tr = 0.2, grp = c(1:p), alpha = 0.05, p = J*K*L) that seems to be what we need. However, I can't find any instructions on how to use it. In what format does the data need to be? How do I fill in the formula?
Here are the variables and factors we use:
Independent variables:
1. factor (between-subjects) = Body concern (high/low)
2. factor (within-subjects) = Condition (typical frequency / 3x increased frequency of Body Checking)
3. factor (within-subjects) = Time (pre / post intervention)
Dependent variable:
eg. Score on the "Drive for Thinness" questionnaire (Eating pathology)
For our 2-way mixed ANOVA (bwtrim function of the WRS2 package), I found a very helpful summary by the authors themselves (Robust Statistical Methods Using WRS2, Mair & Wilcox, https://rdrr.io/cran/WRS2/#vignettes). Is there something similar for the bwwtrim function?
Thank you a lot for your help!
All the best,
Hannah Bauer
Relevant answer
Answer
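Since no worked example made it into this thread, here is a hedged sketch of how bwwtrim() is typically called. Two caveats: bwwtrim() ships with Wilcox's WRS collection (sourced from his Rallfun file), not with WRS2, and the cell ordering below (between-subjects factor slowest, last within factor fastest) is an assumption you should verify against Wilcox's documentation:
# minimal sketch: robust 2 (between) x 2 (within) x 2 (within) trimmed-means ANOVA
# assumes Wilcox's Rallfun file has been sourced, e.g. source("Rallfun-v40.txt")
# long data frame df with columns: concern (between), condition, time, dft (DV)
# split() varies its first grouping factor fastest, so list time first and the
# between factor last to get the J-slowest / L-fastest ordering assumed here
cells <- split(df$dft, list(df$time, df$condition, df$concern))
res <- bwwtrim(J = 2, K = 2, L = 2, data = cells, tr = 0.2)
res
If the ordering assumption is wrong for your version of the function, the main effects will come out attached to the wrong factors, which is easy to detect with a deliberately unbalanced test dataset.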
  • asked a question related to Robustness
Question
1 answer
I have noticed that there are single microscopic slide/slip chambers (Cytodyne, Flexflow, IBIDI) and many studies have used these chambers. I wondered how it is possible to have more robust data by using a single fluid flow chamber (1 replicate) and a control?
Relevant answer
Answer
Hi Mustafa,
Our tech support team is happy to help with your question but would need a bit more info on your research question, experimental setup, etc. Please get in touch via email: techsupport@ibidi.com
  • asked a question related to Robustness
Question
1 answer
Dear Researchers,
I am looking for a research paper that is published in a good journal and confirms the reliability of using NASA-POWER data in hydro-climatic studies.
Best wishes,
Mohammed
Relevant answer
Answer
  • asked a question related to Robustness
Question
14 answers
Hello all! As part of my master's thesis, I did an experiment. I now have 5 groups with n=45 participants each. When I look at the data for the manipulation checks, they are not normally distributed. In theory, however, an ANOVA needs normally distributed data. I know that ANOVA is a robust instrument and that I don't have to worry about it with my group size. But now to my question: do I in fact need normally distributed data at all for a manipulation check measure, or is it in the nature of the question that the data is skewed? E.g., if I want to know whether a gain manipulation worked, do I not want the data skewed either to the left (or right, depending on the scale)?
Would be great if somebody could give me feedback on that!
Best
Carina
Relevant answer
Answer
I want to emphasize Daniel Wright's comment about assumptions applying to the populations from which you sampled. Textbooks often present the F-test for one-way ANOVA as if it is an exact test. But in order for it to be an exact test, you would need to have random samples from k populations that are perfectly normally distributed with exactly equal variances. (In addition to that, each observation would have to be perfectly independent of all other observations.) Even if it was possible to meet those conditions (which it is not if you are working with real data), the samples would not be perfectly normal, and would not have exactly equal sample variances.
Because it is not possible to meet the conditions described above (at least when you are working with real data, not simulated data), the F-test for ANOVA is really an approximate test. And when you are using an approximate test, the real question is whether the approximation is good enough to be useful.* That's how I see it. YMMV. ;-)
* Yes, I am borrowing "useful" from George Box's famous statement(s) about all models being wrong, but some being "useful". Several variations on that statement can be found here:
  • asked a question related to Robustness
Question
11 answers
I'm trying to do a robust one-way ANOVA to compare whether there's an effect of my vignette on my dependent variable (learn_c). I need to use the robust version because I don't have equal variances across groups.
When I run a Tukey test on my regular ANOVA, the contrasts seem to make sense.
The "growth" and "placebo" conditions are not significantly different, but both of them are significantly different from "fixed".
However, when I run it using the robust method (using WRS2 package in R), it seems to "misread" the labels and runs the contrasts differently. Now it insists that "growth" and "fixed" are not different but "placebo" and "growth" are.
Does the WRS2 order something differently? Or am I misunderstanding what it does?
Relevant answer
Answer
Not only are the SDs for the 3 groups fairly similar, the sample sizes do not vary all that much either. I mention that, because ANOVA is very robust to heterogeneity of variance when all sample sizes are the same, or nearly so. In any case, given those descriptive stats, I would not be at all uncomfortable with Welch's F-test and a multiple comparison method that is designed for unequal variances--e.g., Games-Howell. You may find this simulation study helpful:
Sauder, D. C., & DeMars, C. E. (2019). An updated recommendation for multiple comparisons. Advances in Methods and Practices in Psychological Science, 2(1), 26-44.
HTH.
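On the original question of why WRS2 seems to "misread" the labels: post-hoc functions in R order contrasts by the levels of the grouping factor, so it is worth fixing the level order explicitly before comparing methods. A minimal sketch (data frame and column names are placeholders):
# minimal sketch: control factor order, then compare classic and robust ANOVA
df$vignette <- factor(df$vignette, levels = c("fixed", "growth", "placebo"))
oneway.test(learn_c ~ vignette, data = df, var.equal = FALSE)  # Welch's F-test
library(WRS2)
t1way(learn_c ~ vignette, data = df)   # robust ANOVA on 20% trimmed means
lincon(learn_c ~ vignette, data = df)  # robust post-hoc pairwise contrasts
Comparing the trimmed means reported by lincon() against the group labels is the quickest way to check whether the contrasts are attached to the groups you think they are.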
  • asked a question related to Robustness
Question
3 answers
I am working on time series of COVID-19 data and am interested in the subject
"Robust Forecasting with Exponential Smoothing and Holt-Winters", and I compute all my results in R. I therefore need help with:
1. any new papers in this field
2. code for Holt-Winters smoothing in R
Thank you for any help
With best wishes
Relevant answer
Answer
Take a look at what really is done
David Booth
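For point 2, base R already covers the essentials; a minimal sketch on a weekly series (the frequency of 52 and the forecast horizon are assumptions):
# minimal sketch: Holt-Winters smoothing and forecasting in base R
x   <- ts(cases, frequency = 52)   # weekly COVID-19 series (placeholder vector)
fit <- HoltWinters(x)              # level, trend and seasonal components
fc  <- predict(fit, n.ahead = 8, prediction.interval = TRUE)
plot(fit, fc)                      # fitted values plus 8-week-ahead forecast
Note that HoltWinters() needs at least two full seasonal cycles to estimate the seasonal component; with a shorter series, drop it via gamma = FALSE.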
  • asked a question related to Robustness
Question
3 answers
Currently I am studying VAR methodologies in the hope of constructing a model for a future project, and have a fairly limited understanding of the necessary criteria to be met to generate robust results. My readings of recent literature make few mentions of residual diagnostics, specifically the joint normality of the residuals.
I have found through trial and error that a small number of exogenous spike ( blip ) dummy variables at key dates such as financial crises or policy changes, through visual inspection of residuals, have corrected the non-normality issue, I have found almost no evidence of similar studies doing the same, which leads me to wonder whether such measures are in fact misspecifications.
Tests for co-integration, in my case the Johansen test, give warnings (Eviews 12) against adding any exogenous variables, so as not to invalidate critical values. However, lag selection criteria for a VAR model differs when correcting residual non-normality with exogenous dummies. My current understanding of creating a VEC model is that, as a preliminary measure, both lag length selection and co-integration tests are performed on the model in levels, assuming all series are I(1) processes. Therefore, my question is whether one should:
1) Perform a cointegration test without dummies and select the lag length with dummies.
2) Abandon the dummies altogether, and subsequently violate the normality assumption.
3) Perform both Lag length selection and co-integration testing on the VAR in levels, then add dummies to the VECM.
My intention is to follow the common empirical approach in analysing both the IRF and VDC of the VECM model (assuming there is cointegration), should this have any bearing on the matter. My understanding is that normality impacts only the validity of hypothesis testing; however, in my reading, I have found no evidence to suggest that IRF and VDC standard errors are robust to non-normality.
many thanks,
Andrew Slaven
(Undergraduate Student at Aberystwyth University)
Relevant answer
Answer
I don't think that there is an easy answer to your question. The problems you discuss are covered in Juselius (2006), The Cointegrated VAR Model, Oxford. When I last used Eviews their Johansen routines would not have covered all the routines in this book. That was some time ago and things may have changed. Cats in Rats (estima.com) or the equivalent in Oxmetrics were better. For a more modern survey, you might look at Killian and LUtkepohl, Structural Vector Autoregressive Analysis, Oxford. These are graduate-level tests and are not easygoing.
  • asked a question related to Robustness
Question
3 answers
We know that in space, compact size and light weight are key design features. Compact size can sometimes be a constraint on high antenna performance. Deployable origami antennas can be a good candidate to solve this problem. But are they robust enough to work in the space environment?
Relevant answer
Answer
You might read my paper:
Origami based ultraviolet C device for low cost portable disinfection- using a parametric approach to design
Thanks.
  • asked a question related to Robustness
Question
6 answers
the small-signal stability of a two-area power system with and without DFIG
Relevant answer
Answer
  • asked a question related to Robustness
Question
2 answers
In robust optimization, random variables are modeled as uncertain parameters belonging to a convex uncertainty set and the decision-maker protects the system against the worst case within that set.
In the context of nonlinear multi-stage max-min robust optimization problems:
What are the best robustness models such as Strict robustness, Cardinality constrained robustness, Adjustable robustness, Light robustness, Regret robustness, and Recoverable robustness?
How to solve max-min robust optimization problems without linearization/approximations efficiently? Algorithms?
How to approach nested robust optimization problems?
For example, the problem can be security-constrained AC optimal power flow.
Relevant answer
Answer
To tractably reformulate robust nonlinear constraints, you can use the Fenchel duality scheme proposed by Ben Tal, Hertog and Vial in
"Deriving Robust Counterparts of Nonlinear Uncertain Inequalities"
Also, you can use Affine Decision Rules to deal with the multi-stage decision making structure. Check for example: "Optimality of Affine Policies in Multistage Robust Optimization" by Bertsimas, Iancu and Parrilo.
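A small worked illustration of what "deriving a robust counterpart" means in the strict-robustness case: for a linear constraint with ellipsoidal uncertainty, the inner maximization can be solved in closed form, so no linearization or approximation is needed (a standard result in the Ben-Tal et al. framework cited above):
% robust counterpart of a linear constraint under ellipsoidal uncertainty
\[
  a^{\top}x \le b \quad \forall\, a \in \{\bar{a} + Pu : \lVert u \rVert_2 \le 1\}
  \quad\Longleftrightarrow\quad
  \bar{a}^{\top}x + \lVert P^{\top}x \rVert_2 \le b ,
\]
% because \max_{\lVert u \rVert_2 \le 1} u^{\top}P^{\top}x = \lVert P^{\top}x \rVert_2.
The same dualization idea is what the Fenchel duality scheme generalizes to nonlinear uncertain constraints.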
  • asked a question related to Robustness
Question
1 answer
It is explained that exosomes are robust in nature: they can withstand pH or temperature changes and are stable in various buffers. So can we suspend exosomes in pure or distilled water? And if we do, does it affect the markers present on them, and if so, what changes occur?
Relevant answer
Answer
Structural stability in different pHs doesn't mean that the exosomes will be stable enough and constant in hypotonic conditions (e.g pure distilled water). Various reactions and changes would be expected.
  • asked a question related to Robustness
Question
8 answers
Hi, I'm currently searching for a rigorous approach to show that my estimated regression coefficients are robust to sampling procedures.
I have performed a fixed-effect IV regression on a full sample and obtained coefficients. I need to show that my regression coefficients are invariant, or robust, to different subsets of the sample. How can I test to show that my coefficients are invariant to, say, the coefficients from a regression after dropping 5, 10, 15, 20%... of the sample?
Relevant answer
Answer
Moonwon Chung Thank you for providing additional context. This sounds like you would be interested in comparisons across fixed/specific groups of companies rather than a random selection of companies. That is, do you want to formally compare the regression coefficients across companies with, for example, three versus four products (or different ranges, say 3 to 5 versus 6 to 8 products etc.)? If that is the case, then multigroup regression analysis would allow you to formally test whether the regression coefficients differ significantly across those (independent) groups. You don't need to turn your model into an SEM with latent variables for that. All you need is a program for SEM that allows you to run multigroup analysis (most SEM programs do). You can then specify your regression model as a multigroup model and test whether the coefficients differ significantly across groups.
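A simple complement to the multigroup approach, for the random-subsampling version of the question: repeatedly drop a given fraction of observations, refit, and look at the spread of the coefficient of interest. A minimal sketch, where lm() is a stand-in for the fixed-effect IV estimator and all names are placeholders:
# minimal sketch: coefficient stability under random subsampling
set.seed(1)
fracs <- c(0.05, 0.10, 0.15, 0.20)   # share of the sample to drop
stab <- sapply(fracs, function(f) {
  replicate(200, {
    keep <- sample(nrow(df), size = round((1 - f) * nrow(df)))
    coef(lm(y ~ x1 + x2, data = df[keep, ]))["x1"]  # refit on the subsample
  })
})
colnames(stab) <- paste0("drop_", fracs * 100, "pct")
boxplot(stab)  # distribution of the x1 coefficient per dropped fraction
If the boxplots stay tightly around the full-sample estimate, that is descriptive evidence of invariance; the multigroup test above is the formal counterpart for prespecified groups.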
  • asked a question related to Robustness
Question
4 answers
I am conducting research on quality in higher education using a system dynamics approach.
How can I determine the number and components of improvement scenarios for the future?
Are there robust criteria for selecting the number or components of scenarios?
Are there references for identifying the number and components of scenarios?
Thank you so much
Relevant answer
Answer
System dynamics models help you to understand the interrelationships between variables. System dynamics can be an effective tool in a variety of settings. It lets you develop a fairly complex system model. You can also, for example, look at the interactions of feedback loops to help see how a system reacts over time. You asked about how to determine the number and components of improvement scenarios, robust selection criteria, and identifying the number and components of scenarios. Your questions are good, but might be premature.
Since you are doing a study, you need to first start with a research question/hypothesis. That will suggest the best methodology to use. Your method and research design will help you to answer the questions you asked. The professional literature can help you gain a better focus while showing you what has been done and what research is needed.
  • asked a question related to Robustness
Question
1 answer
Natural Language Processing
Relevant answer
Answer
Sketch Engine is quite robust
  • asked a question related to Robustness
Question
3 answers
If not, what non-parametric test could be used?
Relevant answer
Answer
Try the Aligned Rank Transform (ART) method via the R package ARTool, proposed by Wobbrock et al. (2011), to prepare data for a non-parametric ANOVA. There is also a non-R-based version.
Wobbrock, J. O., Findlater, L., Gergle, D., & Higgins, J. J. (2011, May). The aligned rank transform for nonparametric factorial analyses using only ANOVA procedures. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 143-146).
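A minimal sketch of ARTool in use (factor and response names are placeholders; for repeated-measures designs an Error() term would be added to the formula):
# minimal sketch: aligned rank transform for a two-factor design
library(ARTool)
df$f1 <- factor(df$f1)
df$f2 <- factor(df$f2)
m <- art(y ~ f1 * f2, data = df)  # align-and-rank the response per effect
anova(m)                          # nonparametric factorial ANOVA on the ART data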
  • asked a question related to Robustness
Question
1 answer
How do I check endogeneity in AMOS if I have one IV (knowledge sharing) and one DV (performance)? And how do I check robustness if I have knowledge sharing as IV, performance as DV, and gender as a moderator?
Relevant answer
Answer
There are various methods to do this in AMOS or SPSS.
Check this link:
Also, this video explains the concept very well, which might be helpful for you:
In essence, test whether the relevant coefficient in your regression is significant. If it is, conclude that X and the error term are indeed correlated; there is endogeneity.
  • asked a question related to Robustness
Question
4 answers
This is a very important question, because I have not found a validated scale so far that can be considered as the most robust/reliable
Relevant answer
Answer
Congratulations colleagues. Thanks for the question. An interesting topic. But we have a war... First the victory, then the assessment of losses and reconstruction… And branding later.
  • asked a question related to Robustness
Question
3 answers
I mean something that could do work equivalent to what MAXQDA or ATLAS.ti do?
Relevant answer
Answer
Versions of this question have been asked here several times, so I suggest that use the Search function (at the top of the page) to locate those answers.
  • asked a question related to Robustness
Question
1 answer
I read some articles about statistical robustness of SmartPLS. However, I am not sure about the appropriateness of SmartPLS in the case of survey study involving a representative sample with adequate sample size. Any suggestions?
Thank you!
Relevant answer
Answer
It's my understanding that PLS works better with smaller samples. But, let's hear it from the experts.
  • asked a question related to Robustness
Question
2 answers
Dears,
Do we need to further exploit the genetic robustness arising from distant hybridization? The mule is an excellent example in this regard.
Relevant answer
Answer
Good luck.