Search results for: accidents predictions

17 Fuzzy Multi-Objective Approach for Emergency Location Transportation Problem

Authors: Bidzina Matsaberidze, Anna Sikharulidze, Gia Sirbiladze, Bezhan Ghvaberidze

Abstract:

In the modern world, emergency management decision support systems are actively used by state organizations concerned with extreme and abnormal events; these systems provide optimal and safe management of the supplies needed by civil and military facilities in geographical areas affected by disasters, earthquakes, fires and other accidents, weapons of mass destruction, terrorist attacks, etc. Such extreme events cause significant losses and damage to infrastructure. In these cases, intelligent support technologies are very important for the quick and optimal location-transportation of emergency services, in order to avoid further losses. Timely service from emergency service centers to the affected disaster regions (the response phase) is a key task of the emergency management system, and scientific research in this field occupies an important place among decision-making problems. Our goal was to create an expert knowledge-based intelligent support system to serve as an assistant tool that provides optimal solutions for the above-mentioned problem. The inputs to the mathematical model of the system are objective data as well as expert evaluations. The outputs of the system are solutions of the Fuzzy Multi-Objective Emergency Location-Transportation Problem (FMOELTP) for disaster regions. The development and testing of the intelligent support system were done on the example of an experimental disaster region (a geographical zone of Georgia) generated using simulation modeling. Four objectives are considered in our model. The first objective is to minimize the expected total transportation duration of needed products. The second objective is to minimize the total selection unreliability index of opened humanitarian aid distribution centers (HADCs). The third objective minimizes the number of agents needed to operate the opened HADCs. The fourth objective minimizes the non-covered demand over all demand points. Possibility chance constraints and objective constraints were constructed based on objective-subjective data. The FMOELTP was formulated in a static and fuzzy environment, since the decisions must be made immediately after the disaster (within a few hours) with the information available at that moment. It is assumed that the requests for products are estimated by homeland security organizations, or their experts, based upon their experience and their evaluation of the disaster's seriousness. Estimated transportation times take into account the routing access difficulty of the region and the infrastructure conditions. We propose an epsilon-constraint method for finding exact solutions to the problem, and prove that this approach generates the exact Pareto front of the multi-objective location-transportation problem addressed. For large problem dimensions, the exact method can require long computing times; we therefore propose an approximate method that imposes a number of stopping criteria on the exact method. For large dimensions of the FMOELTP, an Estimation of Distribution Algorithm (EDA) approach is also developed.
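
To illustrate the epsilon-constraint idea on a much smaller scale, the sketch below is a toy bi-objective linear program in Python, not the four-objective fuzzy FMOELTP: one objective is minimized while an upper bound on the second is swept, tracing candidate Pareto points. All coefficients are illustrative.

```python
# Epsilon-constraint on a toy bi-objective LP: minimize f1 subject to f2 <= eps.
import numpy as np
from scipy.optimize import linprog

c1 = np.array([1.0, 2.0])        # first objective (e.g. transport duration)
c2 = np.array([3.0, 1.0])        # second objective (e.g. unreliability index)
A_ub = np.array([[-1.0, -1.0]])  # demand constraint: x0 + x1 >= 4
b_ub = np.array([-4.0])
bounds = [(0, 5), (0, 5)]

pareto = []
for eps in np.linspace(4.0, 16.0, 7):    # sweep the bound on f2
    res = linprog(c1,
                  A_ub=np.vstack([A_ub, c2]),   # add f2 <= eps as a constraint
                  b_ub=np.append(b_ub, eps),
                  bounds=bounds, method="highs")
    if res.success:
        pareto.append((res.fun, float(c2 @ res.x)))

for f1, f2 in pareto:                    # candidate Pareto-front points
    print(f"f1 = {f1:.2f}, f2 = {f2:.2f}")
```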

Keywords: epsilon-constraint method, estimation of distribution algorithm, fuzzy multi-objective combinatorial programming problem, fuzzy multi-objective emergency location/transportation problem

Procedia PDF Downloads 294
16 The Future of Adventure Tourism in a Warmer World: An Exploratory Study of Mountain Guides’ Perception of Environmental Change in Canada

Authors: Brooklyn Rushton, Michelle Rutty, Natalie Knowles, Daniel Scott

Abstract:

As people increasingly search for extraordinary experiences and connections with nature, adventure tourism is experiencing significant growth and providing tourists with life-changing experiences. Unlike built attraction-based tourism, adventure tourism relies entirely on natural heritage, which leaves communities dependent on adventure tourism extremely vulnerable to environmental and climatic changes. A growing body of evidence suggests that global climate change will influence the future of adventure tourism and mountain outdoor recreation opportunities on a global scale. Across Canada, more specifically, climate change is broadly anticipated to present risks for winter snow sports, while opportunities are anticipated to arise for green-season activities. These broad seasonal shifts do not account for the indirect impacts of climate change on adventure tourism, such as the cost of adaptation or the increase in natural hazards and the associated likelihood of accidents. While some research has examined the impact of climate change on the natural environments that adventure tourism relies on, only a very small body of research has specifically focused on guides' perspectives or included hard adventure tourism activities. The guiding industry is unique, as guides are trained through an elegant blend of art and science to make decisions based on experience, observation, and intuition. While quantitative research can monitor change in natural environments, guides' local knowledge can provide eyewitness accounts and outline what environmental changes mean for the future sustainability of adventure tourism. This research will capture the extensive knowledge of mountain guides to better understand the implications of climate change for mountain adventure, as well as potential adaptive responses for the adventure tourism industry. The study uses a structured online survey with open- and closed-ended questions, administered using Qualtrics (an online survey platform) and disseminated to current members of the Association of Canadian Mountain Guides (ACMG). Participation in the study is exclusive to members of the ACMG operating in the outdoor guiding streams. The 25 survey questions are organized into four sections: demographics and professional operation (9 questions), physical change (4 questions), climate change perception (6 questions), and climate change adaptation (6 questions). How mountain guides perceive and respond to climate change is important knowledge for the future of the expanding adventure tourism industry. Results from this study are expected to provide mountain destinations with important information on climate change vulnerability and adaptive capacity. Expected results include guides' insight into: (1) experience- and safety-relevant observed physical changes in guided regions (i.e., glacial coverage, permafrost coverage, precipitation, temperature, and slope instability), (2) changes in hazards within the guiding environment (i.e., avalanches, rockfall, icefall, forest fires, flooding, and extreme weather events), (3) existing and potential adaptation strategies, and (4) key informational and other barriers to adaptation. By gaining insight from the knowledge of mountain guides, this research can help the tourism industry at large understand climate risk and create adaptation strategies to ensure the resiliency of the adventure tourism industry.

Keywords: adventure tourism, climate change, environmental change, mountain hazards

Procedia PDF Downloads 166
15 Effects of Irrigation Applications during Post-Anthesis Period on Flower Development and Pyrethrin Accumulation in Pyrethrum

Authors: Dilnee D. Suraweera, Tim Groom, Brian Chung, Brendan Bond, Andrew Schipp, Marc E. Nicolas

Abstract:

Pyrethrum (Tanacetum cinerariifolium) is a perennial plant belonging to the family Asteraceae. It is cultivated commercially for the extraction of natural insecticide pyrethrins, which accumulate in the achenes of the flower head. Approximately 94% of the pyrethrins are produced within the secretory ducts and trichomes of the achenes of the mature pyrethrum flower. Pyrethrum is the most widely used botanical insecticide in the world, and Australia is currently the world's largest producer. Rainfall in Australia's pyrethrum-growing regions is significantly low during the flowering period, in late spring and early summer. The lack of adequate soil moisture, combined with elevated temperatures during the post-anthesis period, results in yield reductions. Understanding the yield responses of pyrethrum to irrigation is therefore important for pyrethrum as a commercial crop, and irrigation management has been identified as a key area of crop management that could be manipulated to increase yield. Pyrethrum is a comparatively drought-tolerant plant with some ability to survive dry conditions due to its deep rooting, but in dry areas and dry seasons the crop cannot reach its full yield potential without adequate soil moisture. Irrigation is therefore essential during the flowering period to prevent crop water stress and maximise yield: irrigation during the water-deficit period results in an overall increased rate of water uptake and growth by the plant, which is essential to achieve the maximum yield benefit from commercial crops. The effects of irrigation treatments applied during the post-anthesis period on pyrethrum yield responses were studied for two irrigation methods, in a first-harvest commercial pyrethrum field in Waubra, Victoria, during the 2012/2013 season. Drip irrigation and overhead sprinkler irrigation treatments applied throughout the flowering period were compared with a rainfed treatment with respect to flower yield and pyrethrin yield responses. The results showed that the application of 180 mm of irrigation throughout the post-anthesis period, from the early flowering stages to physiological maturity, under the drip irrigation treatment increased pyrethrin concentration by 32%, which, combined with the 95% increase in flower yield, gave a total pyrethrin yield increase of 157% compared to the rainfed treatment. In contrast, the overhead sprinkler irrigation treatment increased pyrethrin concentration by 19%, which, combined with the 60% increase in flower yield, gave a total pyrethrin yield increase of 91% compared to the rainfed treatment. Irrigation treatments applied throughout the post-anthesis period significantly increased flower yield as a result of an increased number of flowers and greater flower size. Irrigation provides adequate soil moisture for flower development, which slows the rate of flower development and increases the length of the flowering period, resulting in a delayed crop harvest (11 days) compared to the rainfed treatment. Overall, irrigation has a major impact on pyrethrin accumulation, increasing both the rate and the duration of pyrethrin accumulation and resulting in a higher pyrethrin yield per flower at physiological maturity. The findings of this study will be important for future yield predictions and for developing advanced agronomic strategies to maximise pyrethrin yield in pyrethrum.
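
The reported totals follow from compounding the concentration and flower-yield gains multiplicatively; a quick arithmetic check in Python, using only the percentages quoted above:

```python
# Total pyrethrin yield = pyrethrin concentration x flower yield, so the
# percentage gains compound multiplicatively.
for label, conc_gain, flower_gain in [("drip", 0.32, 0.95),
                                      ("overhead sprinkler", 0.19, 0.60)]:
    total = (1 + conc_gain) * (1 + flower_gain) - 1
    print(f"{label}: total pyrethrin yield +{total:.0%}")
# drip: +157%; overhead sprinkler: +90% (reported as 91%, i.e. rounding)
```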

Keywords: achene, drip irrigation, overhead irrigation, pyrethrin

Procedia PDF Downloads 382
14 Developing a Machine Learning-Based Cost Prediction Model for Construction Projects Using Particle Swarm Optimization

Authors: Soheila Sadeghi

Abstract:

Accurate cost prediction is essential for effective project management and decision-making in the construction industry. This study aims to develop a cost prediction model for construction projects using Machine Learning techniques and Particle Swarm Optimization (PSO). The research utilizes a comprehensive dataset containing project cost estimates, actual costs, resource details, and project performance metrics from a road reconstruction project. The methodology involves data preprocessing, feature selection, and the development of an Artificial Neural Network (ANN) model optimized using PSO. The study investigates the impact of various input features, including cost estimates, resource allocation, and project progress, on the accuracy of cost predictions. The performance of the optimized ANN model is evaluated using metrics such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R-squared. The results demonstrate the effectiveness of the proposed approach in predicting project costs, outperforming traditional benchmark models. The feature selection process identifies the most influential variables contributing to cost variations, providing valuable insights for project managers. However, this study has several limitations. Firstly, the model's performance may be influenced by the quality and quantity of the dataset used. A larger and more diverse dataset covering different types of construction projects would enhance the model's generalizability. Secondly, the study focuses on a specific optimization technique (PSO) and a single Machine Learning algorithm (ANN). Exploring other optimization methods and comparing the performance of various ML algorithms could provide a more comprehensive understanding of the cost prediction problem. Future research should focus on several key areas. Firstly, expanding the dataset to include a wider range of construction projects, such as residential buildings, commercial complexes, and infrastructure projects, would improve the model's applicability. Secondly, investigating the integration of additional data sources, such as economic indicators, weather data, and supplier information, could enhance the predictive power of the model. Thirdly, exploring the potential of ensemble learning techniques, which combine multiple ML algorithms, may further improve cost prediction accuracy. Additionally, developing user-friendly interfaces and tools to facilitate the adoption of the proposed cost prediction model in real-world construction projects would be a valuable contribution to the industry. The findings of this study have significant implications for construction project management, enabling proactive cost estimation, resource allocation, budget planning, and risk assessment, ultimately leading to improved project performance and cost control. This research contributes to the advancement of cost prediction techniques in the construction industry and highlights the potential of Machine Learning and PSO in addressing this critical challenge. However, further research is needed to address the limitations and explore the identified future research directions to fully realize the potential of ML-based cost prediction models in the construction domain.
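
As a concrete illustration of the PSO-plus-ANN pairing, the following is a minimal sketch with synthetic data standing in for the road reconstruction dataset; the abstract does not state whether PSO tunes network weights or hyperparameters, so here a plain PSO loop tunes two hyperparameters of a scikit-learn MLP, scored by validation RMSE.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))              # stand-ins for cost/resource features
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=300)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=0)

def rmse_of(params):
    # Particle position -> (hidden-layer width, log10 learning rate) -> RMSE.
    width, lr = int(round(params[0])), 10 ** params[1]
    model = MLPRegressor(hidden_layer_sizes=(width,), learning_rate_init=lr,
                         max_iter=500, random_state=0)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_va, model.predict(X_va)) ** 0.5

# Plain PSO over width in [4, 64] and log10(lr) in [-4, -1].
n_particles, n_iters = 8, 10
lo, hi = np.array([4, -4.0]), np.array([64, -1.0])
pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([rmse_of(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([rmse_of(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best (width, log10 lr):", gbest, "val RMSE:", pbest_f.min())
```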

Keywords: cost prediction, construction projects, machine learning, artificial neural networks, particle swarm optimization, project management, feature selection, road reconstruction

Procedia PDF Downloads 12
13 Effect of Velocity-Slip in Nanoscale Electroosmotic Flows: Molecular and Continuum Transport Perspectives

Authors: Alper T. Celebi, Ali Beskok

Abstract:

Electroosmotic (EO) slip flows in nanochannels are investigated using non-equilibrium molecular dynamics (MD) simulations, and the results are compared with the analytical solution of the Poisson-Boltzmann and Stokes (PB-S) equations with a slip contribution. The ultimate objective of this study is to show that the well-known continuum flow model can accurately predict EO velocity profiles in nanochannels, using the slip lengths and apparent viscosities obtained from force-driven flow simulations performed at various liquid-wall interaction strengths. EO flow of an aqueous NaCl solution in silicon nanochannels is simulated under realistic electrochemical conditions, within the validity region of Poisson-Boltzmann theory. A physical surface charge density is determined for the nanochannels based on the dissociation of silanol functional groups on the channel surfaces at known salt concentration, temperature, and local pH. First, we present density profiles and ion distributions from equilibrium MD simulations, ensuring that the desired thermodynamic state and ionic conditions are satisfied. Next, force-driven nanochannel flow simulations are performed to predict the apparent viscosity of the ionic solution between charged surfaces and the slip lengths. The parabolic velocity profiles obtained from the force-driven flow simulations are fitted to a second-order polynomial, and viscosity and slip lengths are quantified by comparing the coefficients of the fitted equation with the continuum flow model. The presence of a charged surface increases the viscosity of the ionic solution while decreasing the velocity-slip at the wall. Afterwards, EO flow simulations are carried out under a uniform electric field for different liquid-wall interaction strengths. The velocity profiles exhibit finite slip near the walls, followed by a conventional viscous flow profile in the electrical double layer that reaches a bulk flow region in the center of the channel. The EO flow is enhanced by increased slip at the walls, which depends on the wall-liquid interaction strength and the surface charge. The MD velocity profiles are compared with predictions from analytical solutions of the slip-modified PB-S equation, where the slip length and apparent viscosity values are obtained from the force-driven flow simulations in charged silicon nanochannels. Our MD results show good agreement with the analytical solutions at various slip conditions, verifying the validity of the PB-S equation in nanochannels as small as 3.5 nm. In addition, the continuum model normalizes the slip length by the Debye length instead of the channel height, which implies that the enhancement in EO flow is independent of the channel height; further MD simulations performed at different channel heights confirm this. This is important because slip-enhanced EO flow is then observable even in micro-channel experiments, by using a hydrophobic channel with large slip and a high-conductivity solution with a small Debye length. The present study provides an advanced understanding of EO flows in nanochannels. Correct characterization of nanoscale EO slip flow is crucial to establish the limits of well-known continuum models, which is required for various applications spanning from ion separation to drug delivery and bio-fluidic analysis.
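
For orientation, the standard continuum estimate of the slip enhancement multiplies the Helmholtz-Smoluchowski velocity by (1 + b/λ_D), where b is the slip length and λ_D the Debye length. The sketch below uses assumed, illustrative parameter values, not the paper's fitted MD values:

```python
import numpy as np

eps = 78.5 * 8.854e-12   # permittivity of water [F/m]
zeta = -0.05             # zeta potential [V] (assumed)
mu = 8.9e-4              # bulk viscosity [Pa s]
E = 1e7                  # applied electric field [V/m] (assumed)
b = 2e-9                 # slip length from force-driven runs [m] (assumed)
lam_D = 1e-9             # Debye length [m] (assumed)

u_hs = -eps * zeta / mu * E          # no-slip Helmholtz-Smoluchowski velocity
u_slip = u_hs * (1.0 + b / lam_D)    # slip-enhanced EO velocity
print(f"no-slip: {u_hs:.3e} m/s, with slip: {u_slip:.3e} m/s")
```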

Keywords: electroosmotic flow, molecular dynamics, slip length, velocity-slip

Procedia PDF Downloads 122
12 Modelling Spatial Dynamics of Terrorism

Authors: André Python

Abstract:

To this day, terrorism persists as a worldwide threat, exemplified by the deadly attacks of January 2015 in Paris and the ongoing massacres perpetrated by ISIS in Iraq and Syria. In response to this threat, states deploy various counterterrorism measures, the cost of which could be reduced through effective preventive measures. To increase the efficiency of preventive measures, policy-makers may benefit from accurate predictive models able to capture the complex spatial dynamics of terrorism occurring at a local scale. Although empirical research carried out at the country level has confirmed theories explaining the diffusion processes of terrorism across space and time, scholars have failed to assess these theories of diffusion at a local scale. Moreover, because scholars have not taken full advantage of recent statistical modelling approaches, they have been unable to build predictive models that are accurate in both space and time. In an effort to address these shortcomings, this research proposes a novel approach to systematically assess the theories of terrorism's diffusion at a local scale and provides a predictive model of the local spatial dynamics of terrorism worldwide. Focusing on the lethal terrorist events that occurred after 9/11, this paper addresses the following question: why and how does lethal terrorism diffuse in space and time? Based on geolocalised data on worldwide terrorist attacks and covariates gathered from 2002 to 2013, a binomial spatio-temporal point process is used to model the probability of terrorist attacks on a sphere (the world), the surface of which is discretised into Delaunay triangles and refined in areas of specific interest. Within a Bayesian framework, the model is fitted through integrated nested Laplace approximation - a recent fitting approach that computes fast and accurate estimates of posterior marginals. Hence, for each location in the world, the model provides a probability of encountering a lethal terrorist attack, along with measures of volatility that inform on the model's predictability. Diffusion processes are visualised through interactive maps that highlight space-time variations in the probability and volatility of encountering a lethal attack from 2002 to 2013. Based on the previous twelve years of observation, the location and lethality of terrorist events in 2014 are accurately predicted. Throughout the global scope of this research, local diffusion processes such as escalation and relocation are systematically examined: the former describes an expansion from areas of high concentration of lethal terrorist events (hotspots) to neighbouring areas, while the latter is characterised by changes in the location of hotspots. Controlling for the effect of geographical, economic, and demographic variables, the results of the model suggest that the diffusion processes of lethal terrorism are jointly driven by contagious and non-contagious factors operating at a local scale, as predicted by theories of diffusion. Moreover, by providing a quantitative measure of predictability, the model prevents policy-makers from making decisions based on highly uncertain predictions. Ultimately, this research may provide important complementary tools to enhance the efficiency of policies that aim to prevent and combat terrorism.
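
The contagious versus non-contagious distinction can be made concrete with a much simpler stand-in than the paper's INLA-fitted point process. The sketch below is entirely synthetic and uses a regular grid instead of Delaunay triangles: a binomial (logistic) model in which a neighbour-attacks covariate stands in for contagious diffusion and a static covariate for non-contagious structure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 40                                   # n x n grid of cells
pop = rng.gamma(2.0, 1.0, size=(n, n))   # hypothetical population covariate
prev = rng.random((n, n)) < 0.05         # attacks in the previous period

# Count attacked neighbours (rook adjacency) as the contagion covariate.
neigh = (np.roll(prev, 1, 0) + np.roll(prev, -1, 0) +
         np.roll(prev, 1, 1) + np.roll(prev, -1, 1)).astype(float)

logit = -4.0 + 0.5 * pop + 1.2 * neigh   # assumed "true" effects
p = 1 / (1 + np.exp(-logit))
y = (rng.random((n, n)) < p).astype(int)  # attacks this period

X = np.column_stack([pop.ravel(), neigh.ravel()])
model = LogisticRegression().fit(X, y.ravel())
print("fitted coefficients (pop, neighbour attacks):", model.coef_[0])
```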

Keywords: diffusion process, terrorism, spatial dynamics, spatio-temporal modeling

Procedia PDF Downloads 327
11 Ensemble Methods in Machine Learning: An Algorithmic Approach to Derive Distinctive Behaviors of Criminal Activity Applied to the Poaching Domain

Authors: Zachary Blanks, Solomon Sonya

Abstract:

Poaching presents a serious threat to endangered animal species, environmental conservation, and human life. Additionally, some poaching activity has even been linked to supplying funds for terrorist networks elsewhere in the world. Consequently, agencies dedicated to protecting wildlife habitats face the near-intractable task of adequately patrolling an entire area (spanning several thousand kilometers) given the limited resources, funds, and personnel at their disposal. Agencies therefore need predictive tools that are both high-performing and easily implementable by the user, to help in learning how the significant features (e.g., animal population densities, topography, and behavior patterns of the criminals within the area) interact with each other, in the hope of abating poaching. This research develops a classification model using machine learning algorithms to aid in forecasting future attacks, one that is both easy to train and performs well compared to other models. We demonstrate how data imputation methods (specifically predictive mean matching, gradient boosting, and random forest multiple imputation) can be applied to analyze data and produce significant predictions across a varied data set, and we apply these methods to improve the accuracy of the adopted prediction models (Logistic Regression, Support Vector Machine, etc.). Finally, we assess the performance of the model and the accuracy of our data imputation methods by training on a real-world data set constituting four years of imputed data and testing on one year of non-imputed data. This paper provides three main contributions. First, we extend work done by the Teamcore and CREATE (Center for Risk and Economic Analysis of Terrorism Events) research group at the University of Southern California (USC), working in conjunction with the Department of Homeland Security, to apply game theory and machine learning algorithms to develop more efficient ways of reducing poaching. This research introduces ensemble methods (Random Forests and Stochastic Gradient Boosting) and applies them to real-world poaching data gathered from the Ugandan rain forest park rangers. Second, we consider the effect of data imputation on both the performance of various algorithms and the general accuracy of the method itself when applied to a dependent variable where a large number of observations are missing. Third, we provide an alternative approach to predicting the probability of observing poaching, both by season and by month. The results from this research are very promising. We conclude that by using Stochastic Gradient Boosting to predict observations of non-commercial poaching by season, we are able to produce statistically equivalent results while being orders of magnitude faster in computation time and complexity. Additionally, when predicting potential poaching incidents by individual month rather than by entire season, boosting techniques produce a mean area-under-the-curve increase of approximately 3% relative to previous season-level prediction schedules.
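
The imputation-then-ensemble pipeline can be sketched in a few lines of scikit-learn. This is a hypothetical, synthetic stand-in for the ranger patrol data: random-forest-based iterative imputation plays the role of the multiple-imputation step, and subsampled gradient boosting the role of Stochastic Gradient Boosting.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))            # e.g. animal density, terrain, ...
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # 1 = poaching observed
X[rng.random(X.shape) < 0.2] = np.nan    # 20% of features missing at random

imputer = IterativeImputer(estimator=RandomForestRegressor(n_estimators=20),
                           random_state=0)
X_imp = imputer.fit_transform(X)

X_tr, X_te, y_tr, y_te = train_test_split(X_imp, y, random_state=0)
clf = GradientBoostingClassifier(subsample=0.8,   # subsampling = "stochastic"
                                 random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```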

Keywords: ensemble methods, imputation, machine learning, random forests, statistical analysis, stochastic gradient boosting, wildlife protection

Procedia PDF Downloads 265
10 Auto Rickshaw Impacts with Pedestrians: A Computational Analysis of Post-Collision Kinematics and Injury Mechanics

Authors: A. J. Al-Graitti, G. A. Khalid, P. Berthelson, A. Mason-Jones, R. Prabhu, M. D. Jones

Abstract:

Motor-vehicle-related pedestrian road traffic collisions are a major road safety challenge, since they are a leading cause of death and serious injury worldwide, contributing to a third of the global disease burden. The auto rickshaw, a common form of urban transport in many developing countries, plays a major transport role, both as a vehicle for hire and for private use. The most common auto rickshaws are quite unlike the 'typical' four-wheel motor vehicle, being typically characterised by three wheels, a non-tilting sheet-metal body or open frame construction, a canvas roof and side curtains, a small driver's cabin, handlebar controls, and a passenger space at the rear. Given the propensity, in developing countries, for auto rickshaws to be used in mixed cityscapes, where pedestrians and vehicles share the roadway, the potential for auto rickshaw impacts with pedestrians is relatively high. Whilst auto rickshaws are used in some Western countries, their limited number and spatial separation from pedestrian walkways, as a result of city planning, have not resulted in significant accident statistics. Thus, auto rickshaws have not been subject to the vehicle-impact-related pedestrian crash kinematic analyses and injury mechanics assessments typically associated with motor vehicle development in Western Europe, North America, and Japan. This study presents a parametric analysis of auto rickshaw related pedestrian impacts by computational simulation, using a Finite Element model of an auto rickshaw and an LS-DYNA 50th percentile male Hybrid III Anthropometric Test Device (dummy). Parametric variables include auto rickshaw impact velocity, auto rickshaw impact region (front, centre or offset), and relative pedestrian impact position (front, side and rear). The output data of each impact simulation were correlated against reported injury metrics - Head Injury Criterion (front, side and rear), Neck Injury Criterion (front, side and rear), and Abbreviated Injury Scale with its reported risk levels - adding greater understanding to the issue of auto rickshaw related pedestrian injury risk. The parametric analyses suggest that pedestrians are subject to a relatively high risk of injury during impacts with an auto rickshaw at velocities of 20 km/h or greater, and some of the impact simulations even indicated a risk of fatality. The present study provides valuable evidence for informing a series of recommendations and guidelines for making the auto rickshaw safer during collisions with pedestrians. Whilst it is acknowledged that the present research findings are based in the field of safety engineering and may over-represent injury risk compared to real-world accidents, many of the simulated interactions produced injury response values significantly greater than current threshold curves and thus justify their inclusion in the study. To reduce the injury risk level and increase the safety of the auto rickshaw, there should be a reduction in the velocity of the auto rickshaw and/or consideration of engineering solutions, such as retrofitting injury mitigation technologies to those auto rickshaw contact regions subject to the greatest risk of producing pedestrian injury.
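
The Head Injury Criterion referenced above is a windowed integral of resultant head acceleration: HIC = max over [t1, t2] of (t2 - t1) * (mean acceleration over the window)^2.5, with the window capped at 15 ms for HIC-15. The sketch below computes it from a synthetic half-sine pulse; the study itself derives such metrics from LS-DYNA dummy outputs, which are not reproduced here.

```python
import numpy as np

def hic(t, a, max_window=0.015):
    """HIC over acceleration a(t) in g, window capped at max_window seconds."""
    # Cumulative trapezoidal integral of a(t).
    ia = np.concatenate([[0.0], np.cumsum((a[1:] + a[:-1]) / 2 * np.diff(t))])
    best = 0.0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            dt = t[j] - t[i]
            if dt > max_window:
                break
            avg = (ia[j] - ia[i]) / dt            # mean accel over the window
            best = max(best, dt * max(avg, 0.0) ** 2.5)
    return best

# Synthetic half-sine pulse: 60 g peak over 10 ms (illustrative only).
t = np.linspace(0, 0.02, 201)
a = 60 * np.sin(np.pi * t / 0.01) * (t < 0.01)
print(f"HIC-15 = {hic(t, a):.0f}")
```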

Keywords: auto rickshaw, finite element analysis, injury risk level, LS-DYNA, pedestrian impact

Procedia PDF Downloads 169
9 Religion and Risk: Unmasking Noah's Narratives in the Pacific Islands

Authors: A. Kolendo

Abstract:

The Pacific Islands are among the areas most vulnerable to climate change. Sea level rise and accelerating storm surge continuously threaten communities' habitats on low-lying atolls. With scientific predictions of encroaching tides on their land, the Islanders have been informed about the need for future relocation planning. However, some communities oppose such retreat strategies, reasoning that current climatic changes should be understood through the lens of the biblical ark of Noah. This parable states God's promise never to flood the Earth again and never to deprive people of their land and habitats. Several interpretations of the parable have emerged in Oceania, prompting either climate action or denial. Resistance to relocation planning expressed through Christian thought has led religion to be perceived as a barrier to dialogue between the Islanders and scientists. Since climate change concerns natural processes, attitudes towards environmental stewardship shape the communities' responses to it: some Christian teachings indicate humanity's responsibility over the environment, whereas others assert people's dominion, which prompts resistance and sometimes denial. With church denominations holding various environmental standpoints, competing responses to climate change have emerged in Oceania. Before missionization, traditional knowledge guided the environmental sphere, and it still influences current Christian teachings. Each atoll has its own distinctive body of traditional knowledge; however, a unique relationship with nature unites all the islands. The interconnectedness between the land, the sea, and the people indicates the integrity between the communities and their environments. This factor shapes the comprehension of Noah's story in the context of a climate change that threatens their habitats. Pacific Islanders experience climate change through the slow disappearance of their homelands, whereas the Western world perceives it as a global issue that will affect the population in the long term. The Islanders therefore seek to comprehend this global phenomenon in a local context that reads climate change as the Great Deluge. Accordingly, the safety measures that the parable promotes compensate for the danger of climate change: the rainbow covenant gives hope in God's promise never to flood the Earth again, while Noah's survival relates to the Islanders' current situation. Since these communities have among the lowest rates of carbon emissions, their contribution to anthropogenic climate change is scarce; the lack of environmental sin would contextualize them as contemporary Noahs, with ultimate survival of sea level rise. Through secondary data analysis from a risk compensation perspective, this study aims to challenge the view that religion constitutes a barrier. Instead, religion is portrayed as a source of knowledge that enables comprehension of the communities' situation. By demonstrating that Pacific Islanders use Noah's story as a vessel for coping with the danger of climate change, the study argues that religion provides safety measures that compensate for the projected disappearance of their land. The purpose is to build a bridge between religious communities and scientific bodies, and ultimately to bring understanding between two diverse perspectives. By addressing the practical challenges of interdisciplinary research with faith-based systems, this study uplifts the voices of communities and portrays their experiences as expressed through Christian thought.

Keywords: Christianity, climate change, existential threat, Pacific Islands, story of Noah

Procedia PDF Downloads 64
8 An Integrated Real-Time Hydrodynamic and Coastal Risk Assessment Model

Authors: M. Reza Hashemi, Chris Small, Scott Hayward

Abstract:

The Northeast Coast of the US faces the damaging effects of coastal flooding and winds from Atlantic tropical and extratropical storms each year. Historically, several large storm events have produced substantial levels of damage in the region, most notable of which were the Great Atlantic Hurricane of 1938, Hurricane Carol, Hurricane Bob, and more recently Hurricane Sandy (2012). The objective of this study was to develop an integrated modeling system that could be used as a forecasting/hindcasting tool to evaluate and communicate the risk coastal communities face from these storms. The modeling system utilizes the ADvanced CIRCulation (ADCIRC) model for storm surge predictions and the Simulating WAves Nearshore (SWAN) model for the wave environment. These models were coupled, passing information to each other and computing over the same unstructured domain, allowing for the most accurate representation of the physical storm processes. The coupled SWAN-ADCIRC model was validated and has been set up to perform real-time forecast simulations (as well as hindcasts). Modeled storm parameters were then passed to a coastal risk assessment tool. This tool, which is generic and universally applicable, generates spatial structural damage estimate maps on an individual-structure basis for an area of interest. The required inputs for the coastal risk model include detailed information about the individual structures, inundation levels, and wave heights for the selected region; the calculation of wind damage to structures was also incorporated. The integrated coastal risk assessment system was then tested and applied to Charlestown, a small, vulnerable coastal town along the southern shore of Rhode Island, for Hurricane Sandy and a synthetic storm. In both storm cases, the effect of natural dunes on coastal risk was investigated. The resulting damage maps for Charlestown clearly showed that the dune-eroded scenarios affected more structures and increased the estimated damage. The system was also tested in forecast mode for a large Nor'easter, Stella (March 2017), and the coupled model showed good performance when compared to observations. Finally, the nearshore model XBeach was nested within the regional ADCIRC-SWAN grid to simulate nearshore sediment transport processes and coastal erosion. Hurricane Irene (2011) was used to validate XBeach, on the basis of a unique beach profile dataset for the region. XBeach showed relatively good performance, estimating eroded volumes along the beach transects with a mean error of 16%. The validated model was then used to analyze the effectiveness of several erosion mitigation methods recommended in a recent study of coastal erosion in New England: beach nourishment, a coastal bank (engineered core), and a submerged breakwater, as well as an artificial surfing reef. It was shown that beach nourishment and coastal banks perform best in mitigating shoreline retreat and coastal erosion.
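
A per-structure damage estimate of the kind the risk tool produces can be sketched with a generic depth-damage curve; the study's tool also accounts for wave heights and wind damage, which are omitted here, and all numbers below are illustrative.

```python
import numpy as np

# Fraction of structure value damaged vs. flood depth above first floor (m).
depth_pts = np.array([0.0, 0.3, 1.0, 2.0, 3.0])
damage_pts = np.array([0.0, 0.15, 0.40, 0.70, 0.90])

def damage_fraction(depth_m):
    return np.interp(depth_m, depth_pts, damage_pts)

structures = {            # hypothetical: first-floor elevation (m), value ($)
    "house_A": (1.2, 250_000),
    "house_B": (0.6, 180_000),
}
surge_level = 1.5         # modeled water-surface elevation, e.g. from ADCIRC (m)

for name, (ffe, value) in structures.items():
    d = damage_fraction(max(surge_level - ffe, 0.0))
    print(f"{name}: {d:.0%} damaged, ~${d * value:,.0f}")
```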

Keywords: ADCIRC, coastal flooding, storm surge, coastal risk assessment, living shorelines

Procedia PDF Downloads 85
7 Analytical Model of Locomotion of a Thin-Film Piezoelectric 2D Soft Robot Including Gravity Effects

Authors: Zhiwu Zheng, Prakhar Kumar, Sigurd Wagner, Naveen Verma, James C. Sturm

Abstract:

Soft robots have drawn great interest recently due to the rich range of possible shapes and motions they can take on to address new applications, compared to traditional rigid robots. Large-area electronics (LAE) provides a unique platform for creating soft robots by leveraging thin-film technology to enable the integration of a large number of actuators, sensors, and control circuits on flexible sheets. However, the rich shapes and motions possible, especially when interacting with complex environments, pose significant challenges to forming the well-generalized and robust models necessary for robot design and control. In this work, we describe an analytical model, based on Euler-Bernoulli beam theory, for predicting the shape and locomotion of a flexible (steel-foil-based) piezoelectric-actuated 2D robot. The robot nominally (unpowered) lies flat on the ground; when powered, its shape is controlled by an array of piezoelectric thin-film actuators. Key features of the model are its ability to incorporate the significant effects of gravity on the shape and to precisely predict the spatial distribution of friction against the contacting surfaces, which is necessary for determining inchworm-type motion. We verified the model by developing a distributed discrete-element representation of a continuous piezoelectric actuator and by comparing the analytical predictions to discrete-element robot simulations using PyBullet. Without gravity, predicting the shape of a sheet with a linear array of piezoelectric actuators at arbitrary voltages is straightforward. However, gravity significantly distorts the shape of the sheet, causing some segments to flatten against the ground. Our work includes the following contributions: (i) A self-consistent approach was developed to determine exactly which parts of the soft robot are lifted off the ground, and the exact shape of these sections, for an arbitrary array of piezoelectric voltages and configurations. (ii) Inchworm-type motion relies on controlling the relative friction with the ground surface in different sections of the robot. By adding torque balance to our model and analyzing shear forces, the model can determine the exact spatial distribution of the vertical force that the ground exerts on the soft robot, and from this, the spatial distribution of friction forces between ground and robot. (iii) By combining this spatial friction distribution with the shape of the soft robot as a function of time, as the piezoelectric actuator voltages are changed, the inchworm-type locomotion of the robot can be determined. As a practical example, we calculated the performance of a 5-actuator system on a 50-µm-thick steel foil, assuming the piezoelectric properties of commercially available thin-film piezoelectric actuators. The model predicted inchworm motion of up to 200 µm per step. For independent verification, we also modelled the system using PyBullet, a discrete-element robot simulator. To model a continuous thin-film piezoelectric actuator, we broke each actuator into multiple segments, each consisting of two rigid arms of appropriate mass connected by a 'motor' whose torque was set by the applied actuator voltage. Excellent agreement between our analytical model and the discrete-element simulator was shown, both for the full deformation shape and for the motion of the robot.
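
The gravity-free part of such a model reduces to integrating an actuator-imposed curvature along the sheet. The sketch below recovers a 2D shape from per-segment voltages under an assumed linear curvature-per-volt coefficient, with the paper's gravity and ground-contact treatment omitted:

```python
import numpy as np

seg_len = 0.01                        # segment length [m]
volts = [0, 50, 50, 0, -50, -50, 0]   # hypothetical per-segment voltages [V]
k_per_volt = 0.4                      # curvature per volt [1/(m*V)] (assumed)

theta, x, z = 0.0, [0.0], [0.0]
for v in volts:
    theta += k_per_volt * v * seg_len       # tangent angle from curvature
    x.append(x[-1] + seg_len * np.cos(theta))
    z.append(z[-1] + seg_len * np.sin(theta))

for xi, zi in zip(x, z):                    # segment endpoints of the sheet
    print(f"x = {xi*100:5.2f} cm, z = {zi*100:5.2f} cm")
```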

Keywords: analytical modeling, piezoelectric actuators, soft robot locomotion, thin-film technology

Procedia PDF Downloads 144
6 A Copula-Based Approach for the Assessment of Severity of Illness and Probability of Mortality: An Exploratory Study Applied to Intensive Care Patients

Authors: Ainura Tursunalieva, Irene Hudson

Abstract:

Continuous improvement of both the quality and the safety of health care is an important goal in Australia and internationally. The intensive care unit (ICU) receives patients with a wide variety and severity of illnesses. Accurately identifying patients at risk of developing complications or dying is crucial to increasing healthcare efficiency; thus, it is essential for clinicians and researchers to have a robust framework capable of evaluating the risk profile of a patient. ICU scoring systems provide such a framework. The Acute Physiology and Chronic Health Evaluation III (APACHE III) and the Simplified Acute Physiology Score II (SAPS II) are ICU scoring systems frequently used for assessing the severity of acute illness. These scoring systems collect multiple risk factors for each patient, including physiological measurements, and then render the assessment outcomes of the individual risk factors into a single numerical value; a higher score indicates a more severe patient condition. Furthermore, the Mortality Probability Model II (MPM II) uses logistic regression based on independent risk factors to predict a patient's probability of mortality. An important overlooked limitation of SAPS II and MPM II is that, to date, they do not include interaction terms between a patient's vital signs. This is a prominent oversight, as there is likely an interplay among vital signs: the co-existence of certain conditions may pose a greater health risk than when those conditions exist independently. One barrier to including such interaction terms in predictive models is dimensionality, as variable selection becomes difficult. We propose an innovative scoring system which takes into account the dependence structure among a patient's vital signs, such as systolic and diastolic blood pressure, heart rate, pulse interval, and peripheral oxygen saturation. Copulas will capture the dependence among both normally distributed and skewed variables, as some of the vital sign distributions are skewed. The estimated dependence parameter will then be incorporated into the traditional scoring systems to adjust the points allocated to the individual vital sign measurements. The same dependence parameter will also be used to create an alternative copula-based model for predicting a patient's probability of mortality. The new copula-based approach will accommodate not only a patient's trajectories of vital signs but also the joint dependence probabilities among the vital signs. We hypothesise that this approach will produce more stable assessments and lead to more time-efficient and accurate predictions. We will use two data sets: (1) 250 ICU patients admitted once to the Chui Regional Hospital (Kyrgyzstan) and (2) the agitation-sedation profiles of 37 ICU patients collected by the Hunter Medical Research Institute (Australia). Both the traditional scoring approach and our copula-based approach will be evaluated using the Brier score to indicate overall model performance, the concordance (or c) statistic to indicate discriminative ability (the area under the receiver operating characteristic (ROC) curve), and goodness-of-fit statistics for calibration. We will also report discrimination and calibration values and provide visualizations of the copulas and of high-dimensional regions of risk interrelating two or three vital signs in so-called higher-dimensional ROCs.
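
The core step of estimating a copula dependence parameter while leaving the skewed margins intact can be sketched as follows, assuming synthetic vitals in place of the clinical data: each margin is rank-transformed to pseudo-observations, mapped to normal scores, and correlated, giving a Gaussian-copula dependence parameter.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=400)
hr = 70 + 12 * z[:, 0]                  # roughly normal heart rate
sbp = 110 * np.exp(0.15 * z[:, 1])      # skewed systolic blood pressure

def normal_scores(x):
    ranks = stats.rankdata(x) / (len(x) + 1)   # pseudo-observations in (0,1)
    return stats.norm.ppf(ranks)

# Dependence is estimated on the copula scale, independent of the margins.
rho = np.corrcoef(normal_scores(hr), normal_scores(sbp))[0, 1]
print(f"Gaussian-copula dependence parameter: {rho:.2f}")   # ~0.6 here
```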

Keywords: copula, intensive unit scoring system, ROC curves, vital sign dependence

Procedia PDF Downloads 130
5 Simulation of Multistage Extraction Process of Co-Ni Separation Using Ionic Liquids

Authors: Hongyan Chen, Megan Jobson, Andrew J. Masters, Maria Gonzalez-Miquel, Simon Halstead, Mayri Diaz de Rienzo

Abstract:

Ionic liquids offer excellent advantages over conventional solvents for the industrial extraction of metals from aqueous solutions, where such extraction processes bring opportunities for recovery, reuse, and recycling of valuable resources and more sustainable production pathways. Recent research on the use of ionic liquids for extraction confirms their high selectivity and low volatility, but there is relatively little focus on how their properties can be best exploited in practice. This work addresses gaps in research on process modelling and simulation, to support the development, design, and optimisation of these processes, focusing on the separation of the highly similar transition metals cobalt and nickel. The study exploits published experimental results, as well as new experimental results, relating to the separation of Co and Ni using trihexyl(tetradecyl)phosphonium chloride. This extraction agent is attractive because it is cheaper, more stable, and less toxic than fluorinated hydrophobic ionic liquids. The process modelling work concerns the selection and/or development of suitable models for the physical properties, the distribution coefficients, the mass transfer phenomena, the extractor unit, and the multi-stage extraction flowsheet. The distribution coefficient model for cobalt and HCl represents an anion exchange mechanism, supported by the literature and by COSMO-RS calculations. The parameters of the distribution coefficient models are estimated by fitting the models to published experimental extraction equilibrium results. The mass transfer model applies Newman's hard sphere model. Diffusion coefficients in the aqueous phase are obtained from the literature, while diffusion coefficients in the ionic liquid phase are fitted to dynamic experimental results. The mass transfer area is calculated from the surface-mean diameter of the dispersed-phase droplets, estimated from the Weber number inside the extractor. New experiments measure the interfacial tension between the aqueous and ionic liquid phases, and empirical models for predicting the density and viscosity of the solutions under different metal loadings are fitted to new experimental data. The extractor is modelled as a continuous stirred tank reactor with mass transfer between the two phases and perfect phase separation of the outlet flows. A multistage separation flowsheet simulation, implemented in the gPROMS dynamic process simulation software, is set up to replicate a published experiment and compare model predictions with the experimental results. The results of the single-stage and multi-stage flowsheet simulations are shown to be in good agreement with the published experimental results, and the estimated diffusion coefficient of cobalt in the ionic liquid phase is in reasonable agreement with published diffusion coefficients of various metals in this ionic liquid. A sensitivity study with the simulation model demonstrates the usefulness of the models for process design. The simulation approach has the potential to be extended to other metals, acids, and solvents for the development, design, and optimisation of extraction processes applying ionic liquids to metals separations, although a lack of experimental data currently limits the accuracy of the models within the whole framework. Future work will focus on process development more generally and on the extractive separation of rare earths using ionic liquids.
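
For intuition on why multistage operation matters, a far simpler stand-in than the gPROMS model is the classic Kremser-type stage-wise balance with a constant distribution coefficient; the sketch below uses illustrative numbers, not the study's fitted values, and ignores the mass-transfer kinetics and composition-dependent D that the full model includes.

```python
# Kremser estimate for counter-current extraction with constant D.
D = 20.0        # distribution coefficient for Co (organic/aqueous), assumed
S_over_F = 0.2  # solvent-to-feed flow ratio, assumed
E = D * S_over_F            # extraction factor (formula below assumes E != 1)

for n_stages in (1, 2, 3, 5):
    left = (E - 1) / (E ** (n_stages + 1) - 1)   # fraction left in raffinate
    print(f"{n_stages} stages: {100 * (1 - left):.2f}% Co extracted")
```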

Keywords: distribution coefficient, mass transfer, COSMO-RS, flowsheet simulation, phosphonium

Procedia PDF Downloads 160
4 Phytochemicals and Photosynthesis of Grape Berry Exocarp and Seed (Vitis vinifera, cv. Alvarinho): Effects of Foliar Kaolin and Irrigation

Authors: Andreia Garrido, Artur Conde, Ana Cunha, Ric De Vos

Abstract:

Climate change predictions for Portugal point to increased abiotic stress for crop plants, such as pronounced temperature variation and decreased precipitation, which will have a negative impact on grapevine physiology and, consequently, on grape berry and wine quality. Short-term mitigation strategies have therefore been implemented to alleviate the impacts of adverse climatic periods. These strategies include the foliar application of kaolin, an inert mineral with radiation-reflecting properties that decreases the stress caused by excessive heat/radiation absorbed by the leaves, as well as smart irrigation strategies to avoid water stress. However, little is known about the influence of these mitigation measures on grape berries, either on the photosynthetic activity or on the photosynthesis-related metabolic profiles of the berry's various tissues. Moreover, the role of fruit photosynthesis in berry quality is poorly understood. The main objective of our work was to assess the effects of kaolin and irrigation treatments on the photosynthetic activity of grape berry tissues (exocarp and seeds) and on their global metabolic profile, and to investigate their possible relationship. We therefore collected berries of field-grown plants of the white grape variety Alvarinho from two distinct microclimates, i.e., from clusters exposed to high light (HL, 150 µmol photons m⁻² s⁻¹) and low light (LL, 50 µmol photons m⁻² s⁻¹), from both kaolin-treated and non-kaolin (control) plants, at three fruit developmental stages (green, véraison, and mature). Plant irrigation was applied after harvesting the green berries, which also enabled comparison of véraison and mature berries from irrigated and non-irrigated growth conditions. Photosynthesis was assessed by pulse-amplitude-modulated chlorophyll fluorescence imaging, and the metabolite profile of both tissues was assessed by complementary metabolomics approaches. Foliar kaolin application resulted, for instance, in increased photosynthetic activity of the exocarp of LL-grown berries at the green developmental stage, compared to the non-kaolin control, with a concomitant increase in the levels of several lipid-soluble isoprenoids (chlorophylls, carotenoids, and tocopherols). The exocarp of mature berries grown in the HL microclimate on kaolin-sprayed, non-irrigated plants had a higher total sugar content than in all other treatments, suggesting that foliar application of this mineral results in increased accumulation of photoassimilates in mature berries. Unbiased liquid chromatography-mass spectrometry-based profiling of semi-polar compounds, followed by ASCA (ANOVA simultaneous component analysis) and ANOVA statistical analysis, indicated that kaolin had no or inconsistent effects on the flavonoid and phenylpropanoid composition of both seed and exocarp at any developmental stage; in contrast, both microclimate and irrigation influenced the levels of several of these compounds, depending on berry ripening stage. Overall, our study provides more insight into the effects of mitigation strategies on berry tissue photosynthesis and phytochemistry under contrasting conditions of cluster light microclimate. We hope that this may contribute to developing sustainable vineyard management and to maintaining grape berries and wines of high quality even under increasing abiotic stress.

Keywords: climate change, grape berry tissues, metabolomics, mitigation strategies

Procedia PDF Downloads 92
3 Investigation of Delamination Process in Adhesively Bonded Hardwood Elements under Changing Environmental Conditions

Authors: M. M. Hassani, S. Ammann, F. K. Wittel, P. Niemz, H. J. Herrmann

Abstract:

The application of engineered wood, especially in the form of glued-laminated timber, has increased significantly. Recent progress in plywood made of high-strength and high-stiffness hardwoods, like European beech, gives designers more freedom in general through increased dimensional stability and load-bearing capacity. However, the strong hygric dependence of basically all mechanical properties renders many innovative ideas futile. The tendency of hardwood toward higher moisture sorption and swelling coefficients leads to significant residual stresses in glued-laminated configurations, cross-laminated patterns in particular. These stress fields cause the initiation and evolution of cracks in the bond-lines, resulting in interfacial de-bonding, loss of structural integrity, and reduction of load-carrying capacity. Consequently, delamination of glued-laminated timber made of hardwood elements can be considered the dominant failure mechanism in such composite elements. In addition, long-term creep and mechano-sorption under changing environmental conditions lead to loss of stiffness and can amplify delamination growth over the lifetime of a structure, even after decades. In this study, we investigate the delamination process of adhesively bonded hardwood (European beech) elements subjected to changing climatic conditions. To gain further insight into the long-term performance of adhesively bonded elements during the design phase of new products, the development and verification of an authentic moisture-dependent constitutive model for various species is of great significance. Since a comprehensive moisture-dependent rheological model comprising all possibly emerging deformation mechanisms has been missing up to now, a 3D orthotropic elasto-plastic, visco-elastic, mechano-sorptive material model for wood, with all material constants defined as functions of moisture content, was developed. Apart from the solid wood adherends, the adhesive layer also plays a crucial role in the generation and distribution of the interfacial stresses. The adhesive can be treated as a continuum layer constructed from finite elements and represented as a homogeneous, isotropic material. To obtain a realistic assessment of the mechanical performance of the adhesive layer and a detailed look at the interfacial stress distributions, a generic constitutive model including all potentially activated deformation modes, namely elastic, plastic, and visco-elastic creep, was developed. We focused our studies on the three most common adhesive systems for structural timber engineering: one-component polyurethane adhesive (PUR), melamine-urea-formaldehyde (MUF), and phenol-resorcinol-formaldehyde (PRF). The corresponding numerical integration approaches, with additive decomposition of the total strain, are implemented within the ABAQUS FEM environment by means of the user subroutine UMAT. To predict the true stress state, we perform a history-dependent sequential moisture-stress analysis using the developed material models for both the wood substrate and the adhesive layer. Prediction of the delamination process is founded on the fracture mechanical properties of the adhesive bond-line, measured at different levels of moisture content, and on the application of cohesive interface elements. Finally, we compare the numerical predictions with experimental observations of de-bonding in glued-laminated samples under changing environmental conditions.
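
The additive strain decomposition at the heart of such a UMAT can be illustrated in one dimension. The sketch below uses illustrative material constants, one Kelvin element for visco-elastic creep, and a crude mechano-sorptive term, nothing like the paper's calibrated 3D orthotropic model; it simply accumulates the strain parts under constant stress during a wetting ramp.

```python
import numpy as np

def E_of_mc(mc):
    # Elastic modulus as a function of moisture content (assumed linear
    # softening around 12% mc) [Pa].
    return 14e9 * (1 - 1.5 * (mc - 0.12))

dt, sigma = 86400.0, 5e6       # one-day time step, constant stress [Pa]
E_k, eta_k = 30e9, 1e16        # Kelvin element stiffness [Pa] / viscosity [Pa s]
m_ms = 1e-10                   # mechano-sorptive compliance [1/Pa] (assumed)

eps_ve, eps_ms = 0.0, 0.0
mc_hist = np.linspace(0.12, 0.20, 30)          # moisture-content wetting ramp
for i, mc in enumerate(mc_hist):
    eps_el = sigma / E_of_mc(mc)               # elastic part
    eps_ve += dt * (sigma - E_k * eps_ve) / eta_k     # Kelvin creep update
    if i > 0:                                  # mechano-sorptive creep grows
        eps_ms += m_ms * sigma * (mc - mc_hist[i - 1])  # with moisture change
    print(f"mc = {mc:.3f}: total strain = {eps_el + eps_ve + eps_ms:.3e}")
```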

Keywords: engineered wood, adhesive, material model, FEM analysis, fracture mechanics, delamination

Procedia PDF Downloads 407
2 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion

Authors: Ali Kazemi

Abstract:

In volatile and interconnected global financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning techniques have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study presents a novel method for financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. The proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, using a comprehensive dataset spanning January 1, 2015, to December 31, 2023. This period, marked by considerable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. The algorithm integrates diverse data to construct a dynamic financial graph that accurately reflects market intricacies. We collect daily opening, closing, high, and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insight into the market's buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate key macroeconomic indicators: interest rates, inflation rates, GDP growth, and unemployment rates. The GCN component learns the relational patterns among financial instruments, represented as nodes in a comprehensive market graph; edges encapsulate relationships based on co-movement patterns and sentiment correlations, enabling the model to grasp the complex network of influences governing market movements. Complementing this, the LSTM component is trained on sequences of the spatial-temporal representation learned by the GCN, enriched with historical price and volume data, allowing it to capture and predict temporal market trends accurately. In a comprehensive evaluation across the stock market and cryptocurrency datasets, the GCN-LSTM model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting daily price movements. The RMSE was 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing predictive performance on directional market movements, the model achieved an accuracy of 78%, significantly outperforming the benchmark models, which averaged 65%. This degree of accuracy is instrumental for strategies that predict the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. Our findings promise to improve investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
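
The GCN-to-LSTM wiring can be sketched compactly in PyTorch. This is a minimal, hypothetical setup, not the authors' architecture: one hand-rolled GCN layer (normalized-adjacency message passing) feeds an LSTM over time to predict next-day returns on a toy graph of assets, with random data and assumed sizes throughout.

```python
import torch
import torch.nn as nn

n_assets, n_feats, n_days, hidden = 5, 4, 60, 16

# Symmetrically normalized adjacency: A_hat = D^-1/2 (A + I) D^-1/2.
A = torch.rand(n_assets, n_assets) < 0.4
A = (A | A.T).float() + torch.eye(n_assets)
d = A.sum(1)
A_hat = A / torch.sqrt(d[:, None] * d[None, :])

class GCNLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.gcn = nn.Linear(n_feats, hidden)          # GCN weight matrix
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                    # x: (days, assets, feats)
        h = torch.relu(A_hat @ self.gcn(x))  # per-day graph convolution
        h = h.permute(1, 0, 2)               # -> (assets, days, hidden)
        out, _ = self.lstm(h)                # temporal modelling per asset
        return self.head(out[:, -1]).squeeze(-1)   # next-day return per asset

x = torch.randn(n_days, n_assets, n_feats)  # stand-in price/volume features
y = torch.randn(n_assets)                   # stand-in next-day returns
model = GCNLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                           # tiny illustrative training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
print("train MSE:", float(loss))
```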

Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting

Procedia PDF Downloads 26
1 Enhancing Disaster Resilience: Advanced Natural Hazard Assessment and Monitoring

Authors: Mariza Kaskara, Stella Girtsou, Maria Prodromou, Alexia Tsouni, Christodoulos Mettas, Stavroula Alatza, Kyriaki Fotiou, Marios Tzouvaras, Charalampos Kontoes, Diofantos Hadjimitsis

Abstract:

Natural hazard assessment and monitoring are crucial components in managing the risks associated with fires, floods, and geohazards, particularly in regions prone to these natural disasters, such as Greece and Cyprus. Recent advancements in technology have led to the development of state-of-the-art systems for assessing and monitoring these hazards. These technologies, developed by the BEYOND Center of Excellence of the National Observatory of Athens, have been successfully applied in Greece and are now set to be transferred to Cyprus. The implementation of these advanced technologies in Greece has significantly improved the country's ability to respond to these natural hazards. Enhancing disaster resilience is crucial, as it significantly improves our ability to predict, prepare for, and mitigate the impacts of natural disasters, ultimately saving lives and reducing economic losses. For wildfire risk assessment, a scalar wildfire occurrence risk index has been created based on the predictions of machine learning models: the objective was to train an ML model that derives a fire susceptibility score when given as input a vector of features assigned to particular spatiotemporal coordinates. Predicting fire danger is crucial for the sustainable management of forest fires, as it provides essential information for designing effective prevention measures and facilitating response planning for potential fire incidents. For flood risk assessment, a multi-faceted approach has been employed, including the application of remote sensing techniques; the collection and processing of data on population and buildings, technical studies, and field visits; and hydrological and hydraulic simulations. All input data are used to create precise flood hazard maps for various flooding scenarios, together with detailed flood vulnerability and flood exposure maps, which finally produce the flood risk map. Critical points are identified and mitigation measures are proposed for the worst-case scenario; namely, refuge areas are defined and escape routes are designed. Flood risk maps can assist in raising awareness and can save lives. For geohazard monitoring (e.g., landslides, subsidence), synthetic aperture radar (SAR) and optical satellite imagery have been combined with geomorphological and meteorological data and other factors contributing to landslides and ground deformation. To monitor critical infrastructure, including dams, advanced InSAR (Interferometric SAR) methodologies are used to identify surface movements through time. Monitoring these hazards provides valuable information for understanding the underlying processes and could lead to early warning systems that protect people and infrastructure. The success of these systems in Greece has paved the way for their transfer to Cyprus, enhancing Cyprus's capabilities in natural hazard assessment and monitoring. The transfer is being made through knowledge transfer activities that foster continuous collaboration between Greek and Cypriot experts, and small demonstration actions are implemented to showcase the effectiveness of these technologies in real-world scenarios. In conclusion, the transfer of advanced natural hazard assessment technologies from Greece to Cyprus represents a significant step forward in enhancing the entire region's resilience to disasters. The EXCELSIOR project, which funds this opportunity, is committed to empowering Cyprus with the tools and expertise needed to effectively manage and mitigate the risks associated with these natural hazards.
Acknowledgment: The authors acknowledge the 'EXCELSIOR': ERATOSTHENES: Excellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project.
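
A minimal sketch of such a fire susceptibility scorer follows, with entirely synthetic features and labels; the operational BEYOND models and their actual feature set are not reproduced here. A gradient-boosting classifier maps a feature vector at given spatiotemporal coordinates to a scalar score in [0, 1].

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(5, 40, 2000),    # air temperature [C]
    rng.uniform(0, 20, 2000),    # wind speed [m/s]
    rng.uniform(0, 1, 2000),     # dryness / vegetation-index proxy
    rng.integers(1, 366, 2000),  # day of year
])
# Hypothetical labels: fires cluster on hot, windy, dry days.
p = 1 / (1 + np.exp(-(0.15 * X[:, 0] + 0.2 * X[:, 1] + 3 * X[:, 2] - 9)))
y = rng.random(2000) < p

clf = GradientBoostingClassifier().fit(X, y)
cell = np.array([[35.0, 12.0, 0.8, 210]])    # one grid cell, mid-summer
print(f"fire susceptibility score: {clf.predict_proba(cell)[0, 1]:.2f}")
```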

Keywords: earth observation, monitoring, natural hazards, remote sensing

Procedia PDF Downloads 12