Search results for: flood forecast
626 From Ondoy to Habagat: Comparison of the Community Coping Strategies between Barangay Tumana and Provident Village, Marikina City
Authors: Dinnah Feye H. Andal, Ann Laurice V. Salonga
Abstract:
The paper investigates the flooding that Marikina City residents experienced during the onslaught of Tropical Storm Ondoy on September 26, 2009 and during the heavy downpour caused by the southwest monsoon (Habagat) on August 1-8, 2012. Typhoon Ketsana, locally known as Tropical Storm Ondoy, devastated the whole of Marikina City, displacing many people from their homes and damaging properties as well, as floodwaters rose within a very short period of time. Meanwhile, the massive amount of rainwater brought by the southwest monsoon lasted for a week and also caused flooding in different parts of Metro Manila, including Marikina City. This paper examines how the respondents' experiences of the flooding caused by Tropical Storm Ondoy informed the coping strategies that households in Barangay Tumana and Provident Village employed during the flooding brought by the southwest monsoon rains. Specifically, the research compares the flood-hazard coping strategies of residents of Barangay Tumana and Provident Village before, during and after the flooding caused by the southwest monsoon rains. Both study sites have relatively low elevation and are located along rivers and creeks, which makes them highly susceptible to flooding. Interviews with affected residents were undertaken to understand how a household's coping strategies contribute to the development of community coping strategies at the respective neighborhood level. Based on the findings, income levels, local politics, religion and social relations between and among neighbors affect the way household and community coping strategies differ in the two case study sites.
Keywords: community coping strategies, Habagat, Marikina, Ondoy
Procedia PDF Downloads 315
625 A Smart Sensor Network Approach Using Affordable River Water Level Sensors
Authors: Dian Zhang, Brendan Heery, Maria O’Neill, Ciprian Briciu-Burghina, Noel E. O’Connor, Fiona Regan
Abstract:
Recent developments in sensors, wireless data communication and cloud computing have brought the sensor web to a whole new generation. The introduction of the concept of the 'Internet of Things (IoT)' has brought sensor research to a new level, which involves developing long-lasting, low-cost, environmentally friendly and smart sensors; new wireless data communication technologies; big data analytics algorithms; and cloud-based solutions tailored to large-scale smart sensor networks. The next generation of smart sensor networks consists of several layers: the physical layer, where all the smart sensors reside and data pre-processing occurs, either on the sensor itself or on the field gateway; the data transmission layer, where data and instruction exchanges happen; and the data processing layer, where meaningful information is extracted and organized from the pre-processed data stream. There are many definitions of a smart sensor; however, to summarize them, a smart sensor must be intelligent and adaptable. In future large-scale sensor networks, the collected data are far too large for traditional applications to send, store or process, so the sensor unit must be intelligent enough to pre-process collected data locally on board (this process may occur on the field gateway, depending on the sensor network structure). In this case study, three smart sensing methods, corresponding to simple thresholding, a statistical model and the machine learning based MoPBAS method, are introduced, and their strengths and weaknesses are discussed as an introduction to the smart sensing concept. Data fusion, the integration of data and knowledge from multiple sources, is a key component of the next generation smart sensor network. For example, in a water level monitoring system, a weather forecast can be extracted from external sources, and if heavy rainfall is expected, the server can send instructions to the sensor nodes to, for instance, increase the sampling rate or, conversely, switch on sleep mode. In this paper, we describe the deployment of 11 affordable water level sensors in the Dodder catchment in Dublin, Ireland. The objective is to use this deployed river level sensor network as a case study to give a vision of the next generation of smart sensor networks for flood monitoring, to assist agencies in making decisions about deploying resources in the case of a severe flood event. Some of the deployed sensors are located alongside traditional water level sensors for validation purposes. Each key component of the smart sensor network is discussed, which we hope will inspire researchers working in the sensor research domain.
Keywords: smart sensing, internet of things, water level sensor, flooding
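As a rough illustration of the simple thresholding and statistical smart-sensing approaches named in this abstract, the sketch below flags anomalous river level readings on a sensor node. The threshold value, window length, and synthetic data are assumptions for demonstration only, not parameters of the deployed Dodder network, and the MoPBAS method itself is not reproduced here.

```python
import numpy as np

def threshold_alert(levels, limit_m=2.5):
    """Simple thresholding: flag any reading above a fixed level (assumed limit)."""
    levels = np.asarray(levels, dtype=float)
    return levels > limit_m

def statistical_alert(levels, window=24, k=3.0):
    """Statistical model: flag readings more than k standard deviations
    above the rolling mean of the previous `window` samples."""
    levels = np.asarray(levels, dtype=float)
    alerts = np.zeros(len(levels), dtype=bool)
    for i in range(window, len(levels)):
        past = levels[i - window:i]
        mu, sigma = past.mean(), past.std()
        if sigma > 0 and levels[i] > mu + k * sigma:
            alerts[i] = True
    return alerts

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    levels = 1.0 + 0.05 * rng.standard_normal(200)   # synthetic hourly water levels (m)
    levels[150:160] += 2.0                            # injected flood-like rise
    print("threshold alerts:", threshold_alert(levels).sum())
    print("statistical alerts:", statistical_alert(levels).sum())
```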
Procedia PDF Downloads 381
624 Application of Support Vector Machines in Forecasting Non-Residential
Authors: Wiwat Kittinaraporn, Napat Harnpornchai, Sutja Boonyachut
Abstract:
This paper deals with the application of a machine learning technique, the so-called Support Vector Machine (SVM). The objective of this study is to explore the variables and parameters of forecasting factors in the construction industry in order to build a forecasting model for construction quantity in Thailand. The scope of the research is the non-residential construction quantity in Thailand. There are 44 sets of yearly data available, ranging from 1965 to 2009. The correlation between economic indicators and construction demand with a lag of one year was developed by Apichat Buakla. The selected variables are used to develop SVM models to forecast the non-residential construction quantity in Thailand. The parameters are selected by using the ten-fold cross-validation method. The results are reported in terms of the Mean Absolute Percentage Error (MAPE). The MAPE value for the non-residential construction quantity predicted by epsilon-SVR in combination with a Radial Basis Function (RBF) kernel is 5.90. Analysis of the experimental results shows that the support vector machine modelling technique can be applied to forecast construction quantity time series, which is useful for decision planning and management purposes.
Keywords: forecasting, non-residential, construction, support vector machines
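A minimal sketch, on assumed synthetic data, of the modelling pipeline this abstract describes: epsilon-SVR with an RBF kernel, parameters chosen by ten-fold cross-validation, and accuracy reported as MAPE, using scikit-learn. The feature set, grid values, and series are illustrative, not the study's Thai construction data.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.normal(size=(44, 5))   # 44 yearly observations, 5 lagged economic indicators (synthetic)
y = 100 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=44)  # synthetic construction quantity

model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
param_grid = {"svr__C": [1, 10, 100], "svr__epsilon": [0.01, 0.1, 0.5], "svr__gamma": ["scale", 0.1, 1.0]}
search = GridSearchCV(model, param_grid,
                      cv=KFold(n_splits=10, shuffle=True, random_state=1),
                      scoring="neg_mean_absolute_percentage_error")
search.fit(X, y)

pred = search.predict(X)
mape = np.mean(np.abs((y - pred) / y)) * 100
print("best parameters:", search.best_params_)
print(f"in-sample MAPE: {mape:.2f}%")
```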
Procedia PDF Downloads 434
623 Flood Risk Management in the Semi-Arid Regions of Lebanon - Case Study “Semi Arid Catchments, Ras Baalbeck and Fekha”
Authors: Essam Gooda, Chadi Abdallah, Hamdi Seif, Safaa Baydoun, Rouya Hdeib, Hilal Obeid
Abstract:
Floods are a common natural disaster in the semi-arid regions of Lebanon, resulting in damage to human life and deterioration of the environment. Despite their destructive nature and their immense impact on the socio-economy of the region, flash floods have not received adequate attention from policy and decision makers. This is mainly because of poor understanding of the processes involved and of the measures needed to manage the problem. The current understanding of flash floods remains at the level of general concepts; most policy makers have yet to recognize that flash floods are distinctly different from normal riverine floods in terms of causes, propagation, intensity, impacts, predictability, and management. Flash floods are generally not investigated as a separate class of event but are rather reported as part of the overall seasonal flood situation. As a result, Lebanon generally lacks policies, strategies, and plans relating specifically to flash floods. The main objective of this research is to improve flash flood prediction by providing new knowledge and a better understanding of the hydrological processes governing flash floods in the east catchments of the El Assi River. This includes developing rainstorm time distribution curves that are unique to this type of study region, and analyzing, investigating, and developing a relationship between arid watershed characteristics (including urbanization) and the flood frequency of flows near the villages of Ras Baalbeck and Fekha. This paper discusses different levels of integration approaches between GIS and hydrological models (HEC-HMS and HEC-RAS) and presents a case study in which all the tasks of creating model input, editing data, running the model, and displaying output results are carried out. The study area corresponds to the east basin (Ras Baalbeck and Fakeha), comprising nearly 350 km² and situated in the Bekaa Valley of Lebanon. The database for the case study is derived from Lebanese Army topographic maps of the region; ArcMap was used to digitize the contour lines, streams and other features from these maps, and a digital elevation model (DEM) grid was derived for the study area. The next steps in this research are to incorporate rainfall time series data from the Arseal, Fekha and Deir El Ahmar stations to build a hydrologic data model within a GIS environment, and to combine ArcGIS/ArcMap, HEC-HMS and HEC-RAS models in order to produce a spatial-temporal model for floodplain analysis at a regional scale. In this study, HEC-HMS and the SCS method were chosen to build the hydrologic model of the watershed. The model was then calibrated using the flood event that occurred between the 7th and 9th of May 2014, which is considered exceptionally extreme because of the length of time the flows lasted (15 hours) and the fact that it covered both the Aarsal and Ras Baalbeck watersheds; the strongest previously reported flood lasted for only 7 hours and covered only one watershed. The calibrated hydrologic model is then used to build the hydraulic model and to produce flood hazard maps for the region. The HEC-RAS model is used for this purpose, and field trips were made to the catchments in order to calibrate both the hydrologic and hydraulic models. The presented models are flexible procedures for an ungauged watershed: for some storm events they deliver good results, while for others no parameter vectors can be found.
In order to arrive at a general methodology based on these ideas, further calibration and reconciliation of results with respect to the dependence on flood-event parameters and catchment properties across many events is required.
Keywords: flood risk management, flash flood, semi arid region, El Assi River, hazard maps
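For readers unfamiliar with the SCS method used to build the hydrologic model above, the sketch below evaluates the standard SCS curve-number rainfall-runoff relation; the curve number and storm depth are hypothetical, not calibrated values for Aarsal or Ras Baalbeck.

```python
def scs_runoff_depth(p_mm: float, cn: float) -> float:
    """SCS curve-number direct runoff depth (mm) for a storm of depth p_mm.

    S  = potential maximum retention, Ia = initial abstraction (0.2 * S).
    Q  = (P - Ia)^2 / (P - Ia + S) when P > Ia, otherwise 0.
    """
    s = 25400.0 / cn - 254.0          # retention in mm
    ia = 0.2 * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

if __name__ == "__main__":
    # Hypothetical semi-arid catchment values, for illustration only.
    print(f"runoff depth: {scs_runoff_depth(p_mm=60.0, cn=85.0):.1f} mm")
```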
Procedia PDF Downloads 478
622 Predicting Photovoltaic Energy Profile of Birzeit University Campus Based on Weather Forecast
Authors: Muhammad Abu-Khaizaran, Ahmad Faza’, Tariq Othman, Yahia Yousef
Abstract:
This paper presents a study to provide sufficient and reliable information for constructing a Photovoltaic energy profile of the Birzeit University campus (BZU) based on the weather forecast. The developed Photovoltaic energy profile helps to predict the energy yield of the Photovoltaic systems based on the weather forecast and hence helps in planning energy production and consumption. Two models are developed in this paper: a Clear Sky Irradiance model and a Cloud-Cover Radiation model, which predict the irradiance for a clear sky day and a cloudy day, respectively. The adopted procedure for developing such models takes into consideration two levels of abstraction. First, irradiance and weather data were acquired by a sensory (measurement) system installed on the rooftop of the Information Technology College building at the Birzeit University campus. Second, power readings of a fully operational 51kW commercial Photovoltaic system installed at the University on the rooftop of the adjacent College of Pharmacy-Nursing and Health Professions building are used to validate the output of a simulation model and to help refine its structure. Based on a comparison between a mathematical model, which calculates the Clear Sky Irradiance for the University location, and two sets of accumulated measured data, it is found that the simulation system offers an accurate resemblance to the installed PV power station on clear sky days. However, these comparisons show a divergence between the expected energy yield and the actual energy yield in extreme weather conditions, including clouding and soiling effects. Therefore, a more accurate prediction model for irradiance, the Cloud-Cover Radiation Model (CRM), was developed; it takes into consideration weather factors that affect irradiance, such as relative humidity and cloudiness. The equivalent mathematical formulas implement corrections to provide more accurate inputs to the simulation system. The results of the CRM show a very good match with the actual measured irradiance during a cloudy day. The developed Photovoltaic profile helps in predicting the output energy yield of the Photovoltaic system installed at the University campus based on the predicted weather conditions. The simulation and practical results for both models are in very good agreement.
Keywords: clear-sky irradiance model, cloud-cover radiation model, photovoltaic, weather forecast
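The sketch below illustrates the general idea behind a clear-sky irradiance model and a cloud-cover correction, using one widely quoted simple approximation for direct normal irradiance and a Kasten-Czeplak-style cloud factor. The formulas and coefficients are generic textbook approximations assumed for illustration; they are not the BZU models developed in the paper.

```python
import math

def clear_sky_ghi(zenith_deg: float) -> float:
    """Approximate clear-sky global horizontal irradiance (W/m^2) from the
    solar zenith angle, using a simple Meinel-style attenuation formula.
    Diffuse irradiance is neglected; returns 0 when the sun is below the horizon."""
    if zenith_deg >= 90.0:
        return 0.0
    cos_z = math.cos(math.radians(zenith_deg))
    air_mass = 1.0 / cos_z                         # simple plane-parallel air mass
    dni = 1353.0 * 0.7 ** (air_mass ** 0.678)      # direct normal irradiance (assumed coefficients)
    return dni * cos_z                             # project onto the horizontal plane

def cloud_cover_ghi(ghi_clear: float, okta: int) -> float:
    """Reduce clear-sky irradiance for cloud cover N in oktas (0-8),
    following a Kasten-Czeplak-style correction (coefficients assumed)."""
    return ghi_clear * (1.0 - 0.75 * (okta / 8.0) ** 3.4)

if __name__ == "__main__":
    ghi = clear_sky_ghi(zenith_deg=30.0)
    print(f"clear-sky GHI ~ {ghi:.0f} W/m^2, with 6 oktas cloud ~ {cloud_cover_ghi(ghi, 6):.0f} W/m^2")
```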
Procedia PDF Downloads 132
621 Rational Approach to the Design of a Sustainable Drainage System for Permanent Site of Federal Polytechnic Oko: A Case Study for Flood Mitigation and Environmental Management
Authors: Fortune Chibuike Onyia, Femi Ogundeji Ayodele
Abstract:
The design of a drainage system at the permanent site of Federal Polytechnic Oko in Anambra State is critical for mitigating flooding, managing surface runoff, and ensuring environmental sustainability. The design process employed a comprehensive analysis involving topographical surveys, hydraulic modeling, and the assessment of local soil types to ensure stability and efficient water conveyance. Proper slope gradients were considered to maintain adequate flow velocities and avoid sediment deposition, which could hinder long-term performance. From the result, the channel size estimated was 0.199m by 0.0199m and 0.0199m². This study proposed a channel size of 1.4m depth by 0.5m width and 0.7m², optimized to accommodate the anticipated peak flow resulting from heavy rainfall and storm-water events. This sizing is based on hydrological data, which takes into account rainfall intensity, runoff coefficients, and catchment area characteristics. The objective is to effectively convey storm-water while preventing overflow, erosion, and subsequent damage to infrastructure and properties. This sustainable approach incorporates provisions for maintenance and aligns with urban drainage standards to enhance durability and reliability. Implementing this drainage system will mitigate flood risks, safeguard campus facilities, improve overall water management, and contribute to the development of resilient infrastructure at Federal Polytechnic Oko.
Keywords: flood mitigation, drainage system, sustainable design, environmental management
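A brief sketch of the kind of calculation behind such a channel sizing: peak runoff from the rational method and channel capacity from Manning's equation for a rectangular section. The runoff coefficient, rainfall intensity, catchment area, roughness, and slope below are illustrative assumptions, not the Federal Polytechnic Oko design values.

```python
def rational_peak_flow(c: float, i_mm_per_hr: float, area_ha: float) -> float:
    """Rational method peak flow Q = C * i * A, returned in m^3/s.
    (With i in mm/hr and A in hectares, Q = C * i * A / 360.)"""
    return c * i_mm_per_hr * area_ha / 360.0

def manning_capacity(width_m: float, depth_m: float, n: float, slope: float) -> float:
    """Full-flow capacity of a rectangular channel from Manning's equation,
    Q = (1/n) * A * R^(2/3) * S^(1/2)."""
    area = width_m * depth_m
    perimeter = width_m + 2.0 * depth_m
    radius = area / perimeter
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * slope ** 0.5

if __name__ == "__main__":
    # Hypothetical values for illustration only.
    q_peak = rational_peak_flow(c=0.65, i_mm_per_hr=100.0, area_ha=5.0)
    q_cap = manning_capacity(width_m=0.5, depth_m=1.4, n=0.013, slope=0.005)
    print(f"design peak flow ~ {q_peak:.2f} m^3/s, channel capacity ~ {q_cap:.2f} m^3/s")
```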
Procedia PDF Downloads 7
620 Forecasting the Fluctuation of Currency Exchange Rate Using Random Forest
Authors: Lule Basha, Eralda Gjika
Abstract:
The exchange rate is one of the most important economic variables, especially for a small, open economy such as Albania. Its effect is noticeable in a country's competitiveness, trade and current account, inflation, wages, domestic economic activity, and bank stability. This study investigates the fluctuation of Albania's exchange rate using the monthly average Euro (EUR) to Albanian Lek (ALL) exchange rate over the time span from January 2008 to June 2021, together with the macroeconomic factors that have a significant effect on the exchange rate. Initially, a Random Forest Regression algorithm is constructed to understand the impact of economic variables on the behavior of the monthly average exchange rate. Then forecasts of the macroeconomic indicators for 12 months are produced using time series models. The predicted values are fed into the random forest model in order to obtain the average monthly forecast of the Euro to Albanian Lek (ALL) exchange rate for the period July 2021 to June 2022.
Keywords: exchange rate, random forest, time series, machine learning, prediction
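A compact sketch of the two-stage idea described above: a random forest regression mapping macroeconomic indicators to the monthly EUR/ALL rate, then fed with forecasted indicator values, built with scikit-learn. The indicator names and synthetic data are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
n = 162  # monthly observations, Jan 2008 - Jun 2021
macro = pd.DataFrame({
    "inflation": rng.normal(2.0, 1.0, n),        # hypothetical macro indicators
    "interest_rate": rng.normal(1.5, 0.5, n),
    "trade_balance": rng.normal(-100, 30, n),
})
eur_all = 122 + 2 * macro["inflation"] - 3 * macro["interest_rate"] + rng.normal(0, 1, n)

rf = RandomForestRegressor(n_estimators=500, random_state=7)
rf.fit(macro, eur_all)
print("feature importances:", dict(zip(macro.columns, rf.feature_importances_.round(3))))

# Step 2: plug in 12 months of forecasted indicators (here simply assumed constant values).
future_macro = pd.DataFrame({"inflation": [2.1] * 12, "interest_rate": [1.4] * 12, "trade_balance": [-95] * 12})
print("forecast EUR/ALL:", rf.predict(future_macro)[:3].round(2), "...")
```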
Procedia PDF Downloads 103
619 Fuzzy Time Series-Markov Chain Method for Corn and Soybean Price Forecasting in North Carolina Markets
Authors: Selin Guney, Andres Riquelme
Abstract:
One of the main purposes of producing optimal and efficient forecasts of agricultural commodity prices is to guide firms in advancing their economic decision-making, such as planning business operations and marketing decisions. Governments are also beneficiaries and suppliers of agricultural price forecasts. They use this information to establish proper agricultural policy; hence, the forecasts affect social welfare, and systematic errors in forecasts could lead to a misallocation of scarce resources. Various empirical approaches with different methodologies have been applied to forecast commodity prices. The most commonly used approaches for forecasting commodity sectors depend on classical time series models that assume the values of the response variables are precise, which is quite often not true in reality. Recently, this literature has largely evolved toward fuzzy time series models, which provide more flexibility with respect to the assumptions of classical time series models, such as stationarity and large sample size requirements. Moreover, the fuzzy modeling approach allows decision making with estimated values under incomplete information or uncertainty. A number of fuzzy time series models have been developed and implemented over the last decades; however, most of them are not appropriate for forecasting repeated and nonconsecutive transitions in the data. The modeling scheme used in this paper eliminates this problem by introducing a Markov modeling approach that takes into account both repeated and nonconsecutive transitions. In addition, the determination of the length of interval is crucial for the accuracy of forecasts. The problem of determining the length of interval arbitrarily is overcome, and a methodology to determine the proper length of interval based on the distribution or mean of the first differences of the series is proposed to improve forecast accuracy. The specific purpose of this paper is to propose and investigate the potential of a new forecasting model that integrates this methodology for determining the proper length of interval with a Fuzzy Time Series-Markov Chain model. Moreover, the forecasting accuracy of the proposed integrated model is compared to different univariate time series models, and the superiority of the proposed method over competing methods in terms of modelling and forecasting is demonstrated on the basis of forecast evaluation criteria. The application is to daily corn and soybean prices observed at three commercially important North Carolina markets: Candor, Cofield and Roaring River for corn, and Fayetteville, Cofield and Greenville City for soybeans, respectively. One main conclusion of this paper is that using fuzzy logic improves forecast performance and accuracy; the effectiveness and potential benefits of the proposed model are confirmed by small selection criterion values such as MAPE. The paper concludes with a discussion of the implications of integrating fuzzy logic and non-arbitrary determination of the length of interval for the reliability and accuracy of price forecasts. The empirical results represent a significant contribution to our understanding of the applicability of fuzzy modeling to commodity price forecasts.
Keywords: commodity, forecast, fuzzy, Markov
Procedia PDF Downloads 217
618 Using Discriminant Analysis to Forecast Crime Rate in Nigeria
Authors: O. P. Popoola, O. A. Alawode, M. O. Olayiwola, A. M. Oladele
Abstract:
This research work is based on using discriminant analysis to forecast the crime rate in Nigeria between 1996 and 2008. The work is interested in how gender (male and female) relates to offences committed against the government, against other properties, disturbance in public places, murder/robbery offences and other offences. The data used were collected from the National Bureau of Statistics (NBS). SPSS, the statistical package, was used to analyse the data. Time plots were produced for all 29 offences obtained from the raw data. Eigenvalues, multivariate tests, Wilks’ Lambda, standardized canonical discriminant function coefficients and the predicted classifications were estimated. The research shows that the distribution of the scores from each function is standardized to have a mean of 0 and a standard deviation of 1. The magnitudes of the coefficients indicate how strongly the discriminating variables affect the score. In the predicted group membership, of the 172 cases predicted to fall in the crime-against-Government group, 66 were correctly predicted and 106 were incorrectly predicted. After going through the predicted classifications, we found that for most groups the number of correctly predicted cases was lower than the number incorrectly predicted.
Keywords: discriminant analysis, DA, multivariate analysis of variance, MANOVA, canonical correlation, and Wilks’ Lambda
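As a sketch of the discriminant-analysis workflow described above (fitting discriminant functions and inspecting predicted group membership), the example below uses scikit-learn on synthetic offence data; the group labels and figures are hypothetical, not the NBS records.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(3)
# Synthetic features (e.g., offence counts by gender and year) and offence-group labels.
X = rng.normal(size=(300, 4))
y = rng.integers(0, 3, size=300)   # 0: against government, 1: against property, 2: other

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

scores = lda.transform(X)           # discriminant scores on the canonical functions
pred = lda.predict(X)               # predicted group membership
print("confusion matrix (rows = actual, cols = predicted):")
print(confusion_matrix(y, pred))
```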
Procedia PDF Downloads 468
617 ARIMA-GARCH, A Statistical Modeling for Epileptic Seizure Prediction
Authors: Salman Mohamadi, Seyed Mohammad Ali Tayaranian Hosseini, Hamidreza Amindavar
Abstract:
In this paper, we provide a procedure to analyze and model the EEG (electroencephalogram) signal as a time series using ARIMA-GARCH in order to predict an epileptic attack. The heteroskedasticity of the EEG signal is examined through the ARCH or GARCH (autoregressive conditional heteroskedasticity, generalized autoregressive conditional heteroskedasticity) test. The best ARIMA-GARCH model in the AIC sense is utilized to measure the volatility of the EEG from epileptic canine subjects and to forecast the future values of the EEG. An ARIMA-only model can perform prediction, but an ARCH or GARCH model acting on the residuals of the ARIMA attains a considerably improved forecast horizon. First, we estimate the best ARIMA model; then different orders of ARCH and GARCH models are surveyed to determine the best heteroskedastic model for the residuals of the chosen ARIMA. Using the simulated conditional variance of the selected ARCH or GARCH model, we suggest a procedure to predict oncoming seizures. The results indicate that GARCH modeling determines the dynamic changes in variance well before the onset of a seizure. It can be inferred that the prediction capability comes from the ability of the combined ARIMA-GARCH modeling to cover the heteroskedastic nature of EEG signal changes.
Keywords: epileptic seizure prediction, ARIMA, ARCH and GARCH modeling, heteroskedasticity, EEG
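A minimal sketch of the two-step ARIMA-GARCH procedure described above, using statsmodels for the ARIMA fit and the arch package for a GARCH model on the residuals. The synthetic series and the chosen orders are illustrative assumptions, not the EEG data or the AIC-selected orders from the paper.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(1000)) + 0.3 * rng.standard_normal(1000)  # synthetic EEG-like signal

# Step 1: fit an ARIMA model to the series (order assumed here, not AIC-selected).
arima_res = ARIMA(y, order=(2, 1, 1)).fit()
resid = arima_res.resid

# Step 2: model the conditional heteroskedasticity of the residuals with GARCH(1,1).
garch_res = arch_model(resid, vol="GARCH", p=1, q=1, mean="Zero").fit(disp="off")

# Forecast the conditional variance a few steps ahead; a sustained rise in this
# simulated variance is the kind of signal the paper associates with an oncoming seizure.
var_forecast = garch_res.forecast(horizon=5).variance.iloc[-1]
print(var_forecast.round(3))
```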
Procedia PDF Downloads 406
616 Assessing Artificial Neural Network Models on Forecasting the Return of Stock Market Index
Authors: Hamid Rostami Jaz, Kamran Ameri Siahooei
Abstract:
Up to now, different methods have been used to forecast index returns and the index rate. Artificial intelligence and artificial neural networks have been among the methods used for index return forecasting. This study carries out a comparative study of the performance of a Radial Basis Function Neural Network and a Feed-Forward Perceptron Neural Network in forecasting investment returns on the index. To achieve this goal, the return on investment in the Tehran Stock Exchange index is evaluated and the performance of the Radial Basis Function Neural Network and the Feed-Forward Perceptron Neural Network are compared. The neural networks' performance is tested based on the least-squares error in two approaches: in-sample and out-of-sample. The research results show the superiority of the radial basis neural network in the in-sample approach and the superiority of the perceptron neural network in the out-of-sample approach.
Keywords: exchange index, forecasting, perceptron neural network, Tehran stock exchange
Procedia PDF Downloads 464
615 How Participatory Climate Information Services Assist Farmers to Uptake Rice Disease Forecasts and Manage Diseases in Advance: Evidence from Coastal Bangladesh
Authors: Moriom Akter Mousumi, Spyridon Paparrizos, Fulco Ludwig
Abstract:
Rice yield reduction due to climate change-induced disease occurrence is becoming a great concern for coastal farmers in Bangladesh. The development of participatory climate information services (CIS) based on farmers' needs could facilitate farmers' access to disease forecasts and support better decisions to manage diseases. Therefore, this study aimed to investigate how participatory climate information services assist coastal rice farmers in taking up rice disease forecasts and better managing rice diseases by improving their informed decision-making. Through participatory approaches, we developed a tailor-made agrometeorological service through the DROP app to forecast rice diseases and manage them in advance. During farmer field schools (FFS) we communicated 7-day disease forecasts at face-to-face weekly meetings using printed paper and a messenger app derived from the DROP app. Results show that the majority of the farmers understand disease forecasts through visualization, symbols, and text. The majority of them use disease forecast information directly from the DROP app, followed by face-to-face meetings, the messenger app, and printed paper. Farmers' participation and engagement during capacity-building training at the FFS also assist them in making more informed decisions and in improved management of diseases using both preventive and chemical measures throughout the rice cultivation period. We conclude that the development of participatory CIS and the associated capacity-building and training of farmers has increased farmers' understanding and uptake of disease forecasts for better management of rice diseases. Participatory services such as the DROP app offer great potential as an adaptation option for climate-smart rice production under changing climatic conditions.
Keywords: participatory climate service, disease forecast, disease management, informed decision making, coastal Bangladesh
Procedia PDF Downloads 46
614 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Prediction of Future Data
Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill
Abstract:
Health-care management systems are of great importance because they provide straightforward and fast management of all aspects relating to a patient, not only medical ones. Moreover, there are more and more cases of pathologies in which diagnosis and treatment can only be carried out by using medical imaging techniques. With an ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets through algorithms and techniques drawn from the fields of statistics, machine learning and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted from the independent variables. Forecast control should be used to establish whether the accuracy of the forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences regarding a new product; the consumer preference models provide a platform whereby product developers can decide on the engineering characteristics in order to satisfy consumer preferences before developing the product. Recent analysis shows that these fuzzy regression methods are commonly used to model customer preferences. We propose testing the strength of an exponential regression model against a linear regression model.
Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function
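The closing proposal, testing an exponential regression model against a linear regression model, can be sketched as below on synthetic data. The fitting approach (a log-transform for the exponential model) and the comparison by R² are assumptions about how such a test might be carried out, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 10, 60)
y = 2.0 * np.exp(0.25 * x) * rng.lognormal(0, 0.05, x.size)   # synthetic, roughly exponential data

# Linear model: y = a + b*x (ordinary least squares)
b_lin, a_lin = np.polyfit(x, y, 1)
y_lin = a_lin + b_lin * x

# Exponential model: y = a * exp(b*x), fitted by least squares on log(y)
b_exp, log_a = np.polyfit(x, np.log(y), 1)
y_exp = np.exp(log_a) * np.exp(b_exp * x)

def r_squared(y_obs, y_hat):
    ss_res = np.sum((y_obs - y_hat) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

print(f"linear R^2:      {r_squared(y, y_lin):.3f}")
print(f"exponential R^2: {r_squared(y, y_exp):.3f}")
```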
Procedia PDF Downloads 279
613 Tidal Current Behaviors and Remarkable Bathymetric Change in the South-Western Part of Khor Abdullah, Kuwait
Authors: Ahmed M. Al-Hasem
Abstract:
A study of tidal current behavior and bathymetric changes was undertaken in order to establish an information base for future coastal management. The average velocity of the tidal current was 0.46 m/s and the maximum velocity was 1.08 m/s during ebb tide. During spring tides, maximum velocities range from 0.90 m/s to 1.08 m/s, whereas maximum velocities vary from 0.40 m/s to 0.60 m/s during neap tides. Despite greater current velocities during flood tide, the bathymetric features enhance the dominance of the ebb tide. This can be related to the abundance of fine sediments carried by the ebb current approaching the study area and the relatively coarser sediment carried by the approaching flood current. Significant bathymetric changes were found for the period from 1985 to 1998, with erosion processes dominating. Approximately 96.5% of depth changes occurred within the depth change classes of -5 m to 5 m. The high erosion within the study area will subsequently result in high accretion, particularly in the north, the location of the proposed Boubyan Port and its navigation channel.
Keywords: bathymetric change, Boubyan island, GIS, Khor Abdullah, tidal current behavior
Procedia PDF Downloads 289
612 The Impact of Land Cover Change on Stream Discharges and Water Resources in Luvuvhu River Catchment, Vhembe District, Limpopo Province, South Africa
Authors: P. M. Kundu, L. R. Singo, J. O. Odiyo
Abstract:
The Luvuvhu River catchment in South Africa experiences floods resulting from heavy rainfall of intensities exceeding 15 mm per hour associated with the Inter-Tropical Convergence Zone (ITCZ). The generation of runoff is triggered by the rainfall intensity and soil moisture status. In this study, remote sensing and GIS techniques were used to analyze the hydrologic response to land cover changes. Runoff was calculated as the product of the net precipitation and a curve number coefficient. It was then routed along the flow path using the Muskingum-Cunge method with a diffusive wave transfer model that enabled the calculation of response functions between the start and end points. Flood frequency analysis was carried out using theoretical probability distributions. Spatial data on land cover were obtained from multi-temporal Landsat images, while data on rainfall, soil type, runoff and stream discharges were obtained by direct measurements in the field and from the Department of Water. A digital elevation model was generated from contour maps available at http://www.ngi.gov.za. The results showed that land cover changes had impacted negatively on the hydrology of the catchment. Peak discharges in the whole catchment were noted to have increased by at least 17% over the period, while flood volumes increased by at least 11% over the same period. The flood time to peak indicated a decreasing trend, in the range of 0.5 to 1 hour, over the years. The synergy between remotely sensed digital data and GIS for land surface analysis and modeling was demonstrated, and it was therefore concluded that hydrologic modeling has potential for determining the influence of changes in land cover on the hydrologic response of the catchment.
Keywords: catchment, digital elevation model, hydrological model, routing, runoff
Procedia PDF Downloads 566
611 Collective Potential: A Network of Acupuncture Interventions for Flood Resilience
Authors: Sachini Wickramanayaka
Abstract:
The occurrence of natural disasters has increased at an alarming rate in recent times due to the escalating effects of climate change. One such natural disaster that has continued to grow in frequency and intensity is flooding, which adversely affects communities around the globe. This is an exploration of how architecture can intervene and facilitate the preservation of communities in the face of disaster, specifically in battling floods. 'Resilience' is one of the concepts that has been brought forward to be instilled in vulnerable communities as a preventative and coping mechanism to lower the impact of such disasters. While there are a number of ways to achieve resilience in the built environment, this paper aims to create a synthesis between resilience and 'urban acupuncture'. It will consider strengthening communities from within, by layering a network of relatively small-scale, fast-phased interventions on pre-existing conventional flood-prevention large-scale engineering infrastructure. By investigating 'The Woodlands', a planned neighborhood, as a case study, this paper will argue that large-scale water management solutions, while extremely important, will not suffice as a single solution, particularly during a time of frequent and extreme weather events. The different projects will try to synthesize non-architectural aspects such as neighborhood aspirations, requirements, potential and awareness into a network of architectural forms that would collectively increase neighborhood resiliency to floods. A mapping study of the selected study area will identify the problematic areas that flood in the neighborhood, while empirical data from previously implemented case studies will assess the success of each solution. If successful, the different solutions for each of the identified problem areas will exhibit how flooding and water management can be integrated as part and parcel of daily life.
Keywords: acupuncture, architecture, resiliency, micro-interventions, neighborhood
Procedia PDF Downloads 170
610 Low-Impact Development Strategies Assessment for Urban Design
Abstract:
Climate change and the land-use change caused by urban expansion increase the frequency of urban flooding. To mitigate the increase in runoff volume, low-impact development (LID) is a green approach for reducing the area of impervious surface and managing stormwater at the source with decentralized micro-scale control measures. However, the current benefit assessment and practical application of LID in Taiwan still tend to focus on development plans at the community and building-site scales. In urban design, site-based moisture-holding capacity has been the common index for evaluating the effectiveness of LID, which ignores the diversity and complexity of urban built environments, such as different densities, positive and negative spaces, building volumes and so on. Such inflexible regulations are not only difficult for most developed areas to implement, but are also not suitable for every type of built environment, bringing little benefit to some of them. To enable LID to strengthen its link with urban design and reduce runoff in coping with urban flooding, this research considers the characteristics of different types of built environments in developing LID strategies. The built environments are classified by cluster analysis based on density measures, such as the Ground Space Index (GSI), Floor Space Index (FSI), number of floors (L), and Open Space Ratio (OSR), and their impervious surface rates and runoff volumes are analyzed. Flood situations are simulated using a quasi-two-dimensional floodplain flow model, and the flood mitigation effectiveness of different types of built environments under different low-impact development strategies is evaluated. The information from the results of the assessment can be implemented more precisely in urban design. In addition, it helps in enacting regulations for low-impact development strategies in urban design that are more suitable for each type of built environment.
Keywords: low-impact development, urban design, flooding, density measures
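A sketch of how built environments might be classified from the density measures named above (GSI, FSI, floors L, OSR) using k-means clustering. The measure definitions follow common Spacematrix-style formulations and the parcel data are synthetic, so both should be read as assumptions rather than the study's actual dataset.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
n = 200
site_area = rng.uniform(500, 5000, n)              # m^2, synthetic parcels
footprint = site_area * rng.uniform(0.2, 0.8, n)   # building footprint per parcel
floors = rng.integers(1, 20, n)
gross_floor_area = footprint * floors

density = pd.DataFrame({
    "GSI": footprint / site_area,                  # ground space index (coverage)
    "FSI": gross_floor_area / site_area,           # floor space index (intensity)
    "L": floors.astype(float),                     # number of floors
})
density["OSR"] = (1.0 - density["GSI"]) / density["FSI"]   # open space ratio

labels = KMeans(n_clusters=4, n_init=10, random_state=11).fit_predict(
    StandardScaler().fit_transform(density))
print(density.assign(cluster=labels).groupby("cluster").mean().round(2))
```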
Procedia PDF Downloads 334
609 Bayesian Value at Risk Forecast Using Realized Conditional Autoregressive Expectile Model with an Application to Cryptocurrency
Authors: Niya Chen, Jennifer Chan
Abstract:
In the financial market, risk management helps to minimize potential loss and maximize profit. There are two ways to assess risks. The first is to calculate the risk directly based on the volatility; the most common risk measurements are Value at Risk (VaR), the Sharpe ratio, and beta. Alternatively, we could look at the quantile of the return to assess the risk. Popular return models such as GARCH and stochastic volatility (SV) focus on modeling the return distribution by capturing the volatility dynamics; however, the quantile/expectile method gives us an idea of the distribution at extreme return values. It allows us to forecast VaR using returns, which are direct information. The advantage of using these non-parametric methods is that they are not bound by the distributional assumptions of parametric methods. The difference between them is that the expectile uses a second-order loss function, while quantile regression uses a first-order loss function. We consider several quantile functions, different volatility measures, and estimates from some volatility models. To estimate the expectile of the model, we use the Realized Conditional Autoregressive Expectile (CARE) model with a Bayesian method. We would like to see if our proposed models outperform existing models for cryptocurrency, and we will test this mainly using Bitcoin, as well as Ethereum.
Keywords: expectile, CARE Model, CARR Model, quantile, cryptocurrency, Value at Risk
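To make the quantile/expectile distinction concrete, the sketch below estimates a sample expectile by minimizing the asymmetric squared (second-order) loss, in contrast with the asymmetric absolute (first-order) loss behind quantiles. The return series is synthetic and the simple fixed-point solver is an assumption for illustration, not the Bayesian CARE estimator used in the paper.

```python
import numpy as np

def sample_expectile(returns, tau=0.05, tol=1e-10, max_iter=1000):
    """Expectile e_tau minimises sum of |tau - 1(r <= e)| * (r - e)^2.
    Solved here by fixed-point iteration on the first-order condition."""
    r = np.asarray(returns, dtype=float)
    e = r.mean()
    for _ in range(max_iter):
        w = np.where(r <= e, 1.0 - tau, tau)      # asymmetric squared-loss weights
        e_new = np.sum(w * r) / np.sum(w)
        if abs(e_new - e) < tol:
            break
        e = e_new
    return e

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    rets = rng.standard_t(df=4, size=2000) * 0.03      # heavy-tailed synthetic daily returns
    print(f"5% expectile: {sample_expectile(rets, 0.05):.4f}")
    print(f"5% quantile:  {np.quantile(rets, 0.05):.4f}")
```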
Procedia PDF Downloads 109
608 The Implication of Disaster Risk Identification to Cultural Heritage - The Scenarios of Flood Risk in Taiwan
Authors: Jieh-Jiuh Wang
Abstract:
Disasters happen frequently today due to global climate change. Cultural heritage conservation should therefore be considered from the perspective of the surrounding environment and large-scale disasters. Most current thinking about disaster prevention for cultural heritage in Taiwan is single-point thinking that emphasizes firefighting, decay prevention, and construction reinforcement while ignoring the environment as a whole. Traditional conservation cannot defend against the increasingly severe and frequent natural disasters caused by climate change, and more and more cultural heritage sites are confronting a high risk of disasters. This study adopts the perspective of risk identification and takes flood as the main disaster category. It analyzes the number and categories of cultural heritage sites that might suffer from disasters using a geographic information system that integrates the latest flooding potential data from the National Fire Agency and the Water Resources Agency with basic data on cultural heritage. It examines the actual flood risk that cultural heritage faces and serves as a basis for future considerations of risk measures and disaster-reduction preparation. The results of the study reveal a positive relationship between the disaster impact on national cultural heritage and rainfall intensity. In order of flood impact, the most affected are historical buildings, followed by historical sites designated by municipalities and counties, and then national historical sites and relics. However, traditional settlements and cultural landscapes are not impacted, which might be related to the taboo spaces in the traditional culture of site selection (concepts of disaster avoidance). As for the regional distribution, cultural heritage in central and northern Taiwan suffers from more severe flood impacts, while heritage in northern and eastern Taiwan suffers from greater flooding depths.
Keywords: cultural heritage, flood, preventive conservation, risk management
Procedia PDF Downloads 338
607 Rainfall–Runoff Simulation Using WetSpa Model in Golestan Dam Basin, Iran
Authors: M. R. Dahmardeh Ghaleno, M. Nohtani, S. Khaledi
Abstract:
Flood simulation and prediction is one of the most active research areas in surface water management. WetSpa is a distributed, continuous, physically based model with a daily or hourly time step that describes precipitation, runoff, and evapotranspiration processes for both simple and complex contexts. This model uses a modified rational method for runoff calculation. In this model, runoff is routed along the flow path using the diffusion-wave equation, which depends on the slope, velocity, and flow route characteristics. The Golestan Dam Basin is located in Golestan province in Iran, between coordinates 55° 16´ 50" to 56° 4´ 25" E and 37° 19´ 39" to 37° 49´ 28" N. The area of the catchment is about 224 km², and elevations in the catchment range from 414 to 2856 m at the outlet, with an average slope of 29.78%. Results of the simulations show good agreement between calculated and measured hydrographs at the outlet of the basin. Based on the Nash-Sutcliffe model efficiency coefficient for the calibration period, the model estimated the daily hydrographs and the maximum flow rate with accuracies of up to 59% and 80.18%, respectively.
Keywords: watershed simulation, WetSpa, stream flow, flood prediction
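Since model performance above is reported with the Nash-Sutcliffe efficiency coefficient, a short sketch of that criterion is given below; the observed and simulated discharge arrays are placeholders, not Golestan Dam Basin data.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((Qobs - Qsim)^2) / sum((Qobs - mean(Qobs))^2).
    A value of 1 is a perfect fit; values <= 0 mean the model is no better than the observed mean."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

if __name__ == "__main__":
    q_obs = np.array([5.2, 7.8, 15.3, 30.1, 22.4, 12.0, 8.1])   # placeholder daily discharges (m^3/s)
    q_sim = np.array([4.9, 8.5, 14.0, 27.5, 24.0, 13.2, 7.5])
    print(f"NSE = {nash_sutcliffe(q_obs, q_sim):.2f}")
```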
Procedia PDF Downloads 244
606 Surveying Coastal Society Perception on Giant Sea Wall Jakarta Development Planning
Authors: Ammar Asfari, Faizah Finur Fithriah, Shighia Ajeng Savitri
Abstract:
Jakarta, as the capital city of Indonesia, holds an important role for the country as the city where the central government is located. However, its topographic character, categorized as a lowland area, causes serious trouble. With an average elevation of 7 meters above sea level, floods keep occurring in this city. In addition, water exploitation that has caused land subsidence, together with sea-level rise driven by global warming, makes the situation even worse. The Giant Sea Wall development is a project created by Jakarta's government to overcome flooding, inspired by the Saemangeum Dam in South Korea. In further planning, the Giant Sea Wall is intended to serve as a water reservoir for Jakarta's inhabitants. This research aims to fully understand the knowledge and opinions of people living in North Jakarta (Jakarta's coastal area) on the Giant Sea Wall development planning, using qualitative method analysis with a descriptive approach. The result of this research will be one of the determining factors in the continuation of the Giant Sea Wall Jakarta development planning.
Keywords: descriptive approach, Giant Sea Wall Jakarta, qualitative method analysis, society perception
Procedia PDF Downloads 283
605 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models
Authors: Ramin Vafadary, Maryam Khanbaghi
Abstract:
Forecasting electricity load is important for various purposes such as planning, operation, and control. Forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated by using forecasting accuracy criteria, namely the mean absolute error and the root mean square error. The National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time-series forecasts, we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
Keywords: bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow probability, time series
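A minimal sketch of one of the compared approaches: a seasonal ARIMA fitted with statsmodels and used to forecast 24 hours ahead, scored by MAE and RMSE. The model order and the synthetic hourly load series are illustrative assumptions, not the NREL data or the specification selected in the paper.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(42)
hours = np.arange(24 * 60)                                    # 60 days of synthetic hourly load
load = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1.5, hours.size)

train, test = load[:-24], load[-24:]                          # hold out the last day

model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24))
fit = model.fit(disp=False)
forecast = fit.forecast(steps=24)

mae = np.mean(np.abs(test - forecast))
rmse = np.sqrt(np.mean((test - forecast) ** 2))
print(f"24-hour-ahead MAE = {mae:.2f}, RMSE = {rmse:.2f}")
```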
Procedia PDF Downloads 95
604 Application of Stochastic Models to Annual Extreme Streamflow Data
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. The Auto-Regressive Integrated Moving Average (ARIMA) model was used to simulate these series and forecast their future values. For the analysis, annual extreme streamflow data of the Jelogir Majin station (upstream of the Karkheh dam reservoir) for the years 1958–2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) for the development of the ARIMA model. Interestingly, the ARIMA(4,1,1) model developed was found to be most suitable for simulating annual extreme streamflow for the Karkheh River. The model was found to be appropriate for forecasting ten years of annual extreme streamflow and for assisting decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) codes were used to determine the best model for this series.
Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river
Procedia PDF Downloads 148
603 Case Studies of Mitigation Methods against the Impacts of High Water Levels in the Great Lakes
Authors: Jennifer M. Penton
Abstract:
Record high lake levels in 2017 and 2019 (2017 maximum lake level = 75.81 m; 2018 maximum lake level = 75.26 m; 2019 maximum lake level = 75.92 m), combined with a number of severe storms in the Great Lakes region, have resulted in significant wave generation across Lake Ontario. The resulting large wave heights have led to erosion of the natural shoreline, overtopping of existing revetments, backshore erosion, and partial and complete failure of several coastal structures, which in turn have led to further erosion of the shoreline and damaged existing infrastructure. Such impacts can be seen all along the coast of Lake Ontario. Three specific locations have been chosen as case studies for this paper, each addressing erosion and/or flood mitigation methods, such as revetments and sheet piling with increased land levels. Varying site conditions and the resulting shoreline damage are compared herein. The results are reflected in the case-specific design components of the mitigation and adaptation methods presented in this paper.
Keywords: erosion mitigation, flood mitigation, great lakes, high water levels
Procedia PDF Downloads 173
602 Bathymetric Change of Brahmaputra River and Its Influence on Flooding Scenario
Authors: Arup Kumar Sarma, Rohan Kar
Abstract:
The development of a physical model of a river like the Brahmaputra, which originates in the Chema Yundung glacier of Tibet and flows through India and Bangladesh, is always expensive and very time-consuming. With the advancement of computational techniques, mathematical modeling has found wide application. MIKE 21C is one such commercial package, developed by the Danish Hydraulic Institute (DHI); it uses a depth-averaged approach and a two-dimensional curvilinear finite-difference scheme and is capable of modeling hydrodynamic and morphological processes, with some limitations. The main purposes of this study are to generate the bathymetry of the River Brahmaputra from “Sadia” upstream to “Dhubri” downstream, stretching a distance of approximately 695 km, for four different years (1957, 1971, 1977, and 1981) over the grid generated in MIKE 21C, and to carry out hydrodynamic simulations for these years to analyze the effect of bathymetric change on the surface water elevation. The study has established that bathymetric change can influence the flood level significantly in some of the river reaches, and therefore the regular modification or updating of bathymetry is essential for reliable flood routing in alluvial rivers.
Keywords: bathymetry, brahmaputra river, hydrodynamic model, surface water elevation
Procedia PDF Downloads 455
601 Forecasting Future Society to Explore Promising Security Technologies
Authors: Jeonghwan Jeon, Mintak Han, Youngjun Kim
Abstract:
Due to the rapid development of information and communication technology (ICT), a substantial transformation is currently happening in society. As the range of intelligent technologies and services continuously expands, 'things' are becoming capable of communicating with one another and even with people. However, such an 'Internet of Things' has a technical weakness in that a great amount of information transferred in real time may be widely exposed to security threats. Users' personal data are a typical example facing a serious security threat. Security threats will diversify and arise more frequently as the next generation of unfamiliar technologies develops, and as society becomes increasingly complex, security vulnerability will increase as well. In the existing literature, a considerable number of private and public reports that forecast future society have been published as a preceding step for the selection of future technologies and the establishment of strategies for competitiveness. Although there are previous studies that forecast security technology, they have focused only on technical issues and overlooked the interrelationships between security technology and social factors. Therefore, investigations of future security threats and of security technologies able to protect people from various threats are required. In response, this study aims to derive potential security threats associated with the development of technology and to explore security technologies that can protect against them. To do this, first of all, private and public reports that forecast the future, as well as online documents from technology-related communities, are collected. By analyzing the data, future issues are extracted and categorized in terms of STEEP (Society, Technology, Economy, Environment, and Politics) as well as security. Second, the components of potential security threats are developed based on the classified future issues. Then, points at which the security threats may occur (for example, a mobile payment system based on finger-scan technology) are identified. Lastly, alternatives that prevent the potential security threats are proposed by matching security threats with these points and investigating related security technologies from patent data. The proposed approach can identify ICT-related latent security menaces and provide guidelines in the 'problem-alternative' form by linking threat points with security technologies.
Keywords: future society, information and communication technology, security technology, technology forecasting
Procedia PDF Downloads 468
600 Evaluation of Best-Fit Probability Distribution for Prediction of Extreme Hydrologic Phenomena
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
Probability distributions are the best method for forecasting extreme hydrologic phenomena such as rainfall and flood flows. In this research, in order to determine a suitable probability distribution for estimating annual extreme rainfall and flood flow (discharge) series with different return periods, precipitation records covering 40 years and discharge records covering 58 years were collected from the Karkheh River in Iran. After homogeneity and adequacy tests, the data were analyzed with the Stormwater Management and Design Aid (SMADA) software and the residual sum of squares (R.S.S.). The best probability distribution was Log Pearson Type III, with R.S.S. values of 145.91 and 13.67 for peak discharge and 141.08 and 8.95 for maximum discharge at the Jelogir Majin and Pole Zal stations, respectively. The best distribution for maximum precipitation at the Jelogir Majin and Pole Zal stations was the Log Pearson Type III distribution, with R.S.S. values of 1.74 and 1.90, followed by the Pearson Type III distribution, with R.S.S. values of 1.53 and 1.69. Overall, the Log Pearson Type III distribution is an acceptable distribution type for representing the statistics of extreme hydrologic phenomena in the Karkheh River in Iran, with the Pearson Type III distribution as a potential alternative.
Keywords: Karkheh River, Log Pearson Type III, probability distribution, residual sum of squares
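A short sketch of fitting a Log Pearson Type III distribution to an annual maximum series and reading off quantiles for selected return periods with scipy. The sample data are synthetic, and fitting scipy.stats.pearson3 to the log-transformed flows is one common implementation choice, not necessarily the SMADA procedure used in the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
annual_peaks = rng.lognormal(mean=6.0, sigma=0.5, size=58)      # synthetic annual peak discharges (m^3/s)

# Log Pearson Type III: fit a Pearson III distribution to log10 of the flows.
log_q = np.log10(annual_peaks)
skew, loc, scale = stats.pearson3.fit(log_q)

for T in (2, 10, 50, 100):                                      # return periods in years
    p_non_exceed = 1.0 - 1.0 / T
    q_t = 10 ** stats.pearson3.ppf(p_non_exceed, skew, loc=loc, scale=scale)
    print(f"{T:>3}-year flood ~ {q_t:,.0f} m^3/s")
```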
Procedia PDF Downloads 197
599 Taleghan Dam Break Numerical Modeling
Authors: Hamid Goharnejad, Milad Sadeghpoor Moalem, Mahmood Zakeri Niri, Leili Sadeghi Khalegh Abadi
Abstract:
While there are many benefits to using reservoir dams, their failure leads to destructive effects. From the viewpoint of the International Commission on Large Dams (ICOLD), a dam break means the collapse of the whole or some parts of a dam, whereby the dam becomes unable to hold water. Therefore, studying the dam break phenomenon and predicting its behavior and effects reduce the losses and damage caused by such events. One of the most common types of reservoir dams is the embankment dam. Overtopping in embankment dams occurs because of the inability of the flood discharge system to release inflows to the reservoir. An important issue for managers and engineers is evaluating the performance of the reservoir dam when the rim slides into the storage, creating waves that are large and long. In this study, the effects of floods which caused the overtopping of the dam have been investigated. It was assumed that the spillway is unable to release the inflow. To determine the outflow hydrograph resulting from the dam break, a numerical model using Flow-3D software and empirical equations were used. The results of the numerical model and their comparison with the empirical equations show that both the numerical model and the empirical equations can be used to study the flood resulting from a dam break.
Keywords: embankment dam break, empirical equations, Taleghan dam, Flow-3D numerical model
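To illustrate the kind of empirical equation used alongside the Flow-3D model, the sketch below evaluates a Froehlich-style peak breach outflow relation for an embankment dam; the coefficient values are quoted from memory of the general literature and the reservoir figures are hypothetical, so treat all of them as assumptions rather than Taleghan Dam parameters.

```python
def froehlich_peak_outflow(vol_m3: float, head_m: float) -> float:
    """Froehlich-type empirical peak breach outflow (m^3/s):
    Qp = 0.607 * Vw^0.295 * Hw^1.24, with reservoir volume Vw in m^3 and
    water depth above the breach bottom Hw in m (coefficients assumed)."""
    return 0.607 * vol_m3 ** 0.295 * head_m ** 1.24

if __name__ == "__main__":
    # Hypothetical reservoir: 100 million m^3 stored behind a 60 m head.
    qp = froehlich_peak_outflow(vol_m3=1.0e8, head_m=60.0)
    print(f"estimated peak breach outflow ~ {qp:,.0f} m^3/s")
```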
Procedia PDF Downloads 321
598 Estimation of the Parameters of Muskingum Methods for the Prediction of the Flood Depth in the Moudjar River Catchment
Authors: Fares Laouacheria, Said Kechida, Moncef Chabi
Abstract:
The objective of the study was based on hydrological routing modelling for the continuous monitoring of the hydrological situation in the Moudjar river catchment, especially during floods, with the Hydrologic Engineering Center's Hydrologic Modelling System (HEC-HMS). HEC-GeoHMS was used to transfer data from a geographic information system (GIS) to HEC-HMS for delineating and modelling the catchment river in order to estimate the runoff volume, which is used as input to the hydrological routing model. Two hydrological routing models, namely the Muskingum and Muskingum-Cunge routing models, were used for conducting this study. A comparison between the parameters of the Muskingum and Muskingum-Cunge routing models in HEC-HMS was used for modelling flood routing in the Moudjar river catchment and for determining the relationship between these parameters and the physical characteristics of the river. The results indicate that the effects of input parameters such as the weighting factor "X" and travel time "K" on the output results are significant, with the Muskingum routing model being more sensitive to the input parameters than the Muskingum-Cunge routing model. This study can contribute to understanding and improving knowledge of the mechanisms of river floods, especially in ungauged river catchments.
Keywords: HEC-HMS, hydrological modelling, Muskingum routing model, Muskingum-Cunge routing model
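A compact sketch of the Muskingum routing scheme whose parameters K and X are discussed above; the inflow hydrograph, K, X, and time step are illustrative assumptions, not calibrated Moudjar catchment values.

```python
import numpy as np

def muskingum_route(inflow, k_hr: float, x: float, dt_hr: float):
    """Route an inflow hydrograph with the Muskingum method:
    O[t+1] = C0*I[t+1] + C1*I[t] + C2*O[t], with coefficients derived from K, X and dt."""
    denom = 2.0 * k_hr * (1.0 - x) + dt_hr
    c0 = (dt_hr - 2.0 * k_hr * x) / denom
    c1 = (dt_hr + 2.0 * k_hr * x) / denom
    c2 = (2.0 * k_hr * (1.0 - x) - dt_hr) / denom
    outflow = np.zeros(len(inflow))
    outflow[0] = inflow[0]
    for t in range(len(inflow) - 1):
        outflow[t + 1] = c0 * inflow[t + 1] + c1 * inflow[t] + c2 * outflow[t]
    return outflow

if __name__ == "__main__":
    inflow = np.array([10, 30, 80, 140, 110, 70, 40, 20, 12, 10], dtype=float)  # m^3/s, hypothetical
    print(muskingum_route(inflow, k_hr=2.0, x=0.2, dt_hr=1.0).round(1))
```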
Procedia PDF Downloads 278
597 Predicting Relative Performance of Sector Exchange Traded Funds Using Machine Learning
Abstract:
Machine learning has been used in many areas today. It thrives at reviewing large volumes of data and identifying patterns and trends that might not be apparent to a human. Given the huge potential benefit and the amount of data available in the financial market, it is not surprising to see machine learning applied to various financial products. While future prices of financial securities are extremely difficult to forecast, we study them from a different angle. Instead of trying to forecast future prices, we apply machine learning algorithms to predict the direction of future price movement, in particular, whether a sector Exchange Traded Fund (ETF) would outperform or underperform the market in the next week or in the next month. We apply several machine learning algorithms for this prediction: Linear Discriminant Analysis (LDA), k-Nearest Neighbors (KNN), Decision Tree (DT), Gaussian Naive Bayes (GNB), and Neural Networks (NN). We show that these machine learning algorithms, most notably GNB and NN, have some predictive power in forecasting out-performance and under-performance out of sample. We also explore whether it is possible to utilize the predictions from these algorithms to outperform the buy-and-hold strategy of the S&P 500 index. The trading strategy based on out-performance predictions does not perform very well, but the trading strategy based on under-performance predictions can earn higher returns than simply holding the S&P 500 index out of sample.
Keywords: machine learning, ETF prediction, dynamic trading, asset allocation
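A condensed sketch of the classifier comparison described above, with LDA, KNN, a decision tree, Gaussian naive Bayes, and a neural network predicting whether a sector ETF out- or under-performs the index in the next week, using scikit-learn on synthetic features. The feature construction and labels are assumptions for illustration only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(21)
X = rng.normal(size=(1000, 6))                 # e.g., lagged relative returns, momentum, volatility (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 1000) > 0).astype(int)  # 1 = outperform next week

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=False)  # keep time order

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(n_neighbors=15),
    "DT": DecisionTreeClassifier(max_depth=4, random_state=0),
    "GNB": GaussianNB(),
    "NN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}
for name, model in models.items():
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: out-of-sample accuracy = {acc:.3f}")
```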
Procedia PDF Downloads 98