Search results for: modeling reliability
5318 Integrated Modeling Approach for Energy Planning and Climate Change Mitigation Assessment in the State of Florida
Authors: K. Thakkar, C. Ghenai
Abstract:
An integrated modeling approach was used in this study to (1) track energy consumption, production, and resource extraction, (2) track greenhouse gas emissions, and (3) analyze emissions of local and regional air pollutants. The model was used for short- and long-term energy and GHG emissions reduction analysis for the state of Florida. The integrated modeling methodology helps to evaluate alternative energy scenarios and examine emissions-reduction strategies. The mitigation scenarios have been designed to describe future energy strategies and consist of various demand- and supply-side scenarios. One of the GHG mitigation scenarios is crafted by taking into account the available renewable resource potential for power generation in the state of Florida, in order to compare and analyze the GHG reduction measures against the ‘Business As Usual’ and ‘Florida State Policy’ scenarios. Two more ‘integrated’ scenarios (‘Electrification’ and ‘Efficiency and Lifestyle’) are crafted by combining various mitigation scenarios to assess the cumulative impact of reduction measures such as technological changes and energy efficiency and conservation.
Keywords: energy planning, climate change mitigation assessment, integrated modeling approach, energy alternatives, GHG emission reductions
Procedia PDF Downloads 443
5317 An Integrated Framework for Engaging Stakeholders in the Circular Economy Processes Using Building Information Modeling and Virtual Reality
Authors: Erisasadat Sahebzamani, Núria Forcada, Francisco Lendinez
Abstract:
Global climate change has become increasingly problematic over the past few decades, and the construction industry has been a significant contributor to greenhouse gas emissions. Considering these issues and the high demand for materials in the construction industry, the Circular Economy (CE) is considered necessary to keep materials in the loop and extend their useful lives. By providing tangible benefits, Construction 4.0 facilitates the adoption of CE by reducing waste, updating standard work, sharing knowledge, and increasing transparency and stability. This study aims to present a framework for integrating CE and digital tools like Building Information Modeling (BIM) and Virtual Reality (VR) to examine the impact on the construction industry based on stakeholders' perspectives.
Keywords: circular economy, building information modeling, virtual reality, stakeholder engagement
Procedia PDF Downloads 111
5316 Modeling and Optimization of Nanogenerator for Energy Harvesting
Authors: Fawzi Srairi, Abderrahmane Dib
Abstract:
Recently, the desire for self-powered micro- and nanodevices has attracted great interest in the use of sustainable energy sources. The ultimate goal of a nanogenerator is to harvest energy from the ambient environment in which the self-powered device based on these generators operates. With the development of nanogenerator-based circuit design and optimization, building a new device simulator is necessary for the study and synthesis of the electromechanical parameters of this type of model. In the present article, both numerical modeling and optimization of a piezoelectric nanogenerator based on zinc oxide have been carried out. They aim to improve the electromechanical performance, robustness, and synthesis process of the nanogenerator. The proposed model has been developed for a systematic study of the nanowire morphology parameters in stretching mode. In addition, a heuristic optimization technique, namely particle swarm optimization, has been implemented for the analytic modeling and optimization of the nanogenerator-based process in stretching mode. Moreover, the obtained results have been tested and compared with a conventional model, and good agreement was obtained for the excitation mode. The developed nanogenerator model can be generalized, extended, and integrated into device simulators to study nanogenerator-based circuits.
Keywords: electrical potential, heuristic algorithms, numerical modeling, nanogenerator
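To illustrate the optimization step mentioned above, the following is a minimal particle swarm optimization sketch in Python. The objective function, parameter names (nanowire length and radius), and bounds are hypothetical stand-ins, not the paper's electromechanical model.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical smooth surrogate for the (negative) output potential of a
    # ZnO nanowire in stretching mode as a function of (length_um, radius_nm).
    # NOT the authors' electromechanical model; used only to exercise the loop.
    length, radius = x[..., 0], x[..., 1]
    return -(np.sin(length) * np.exp(-((radius - 50.0) / 30.0) ** 2))

# Assumed parameter bounds: nanowire length [um] and radius [nm]
lo, hi = np.array([0.5, 5.0]), np.array([5.0, 120.0])

n_particles, n_iter = 30, 200
w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration coefficients

x = rng.uniform(lo, hi, size=(n_particles, 2))   # particle positions
v = np.zeros_like(x)                              # particle velocities
pbest, pbest_f = x.copy(), objective(x)
gbest = pbest[np.argmin(pbest_f)]

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = objective(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print("best (length_um, radius_nm):", gbest, "objective:", pbest_f.min())
```

The same loop structure applies once the placeholder objective is replaced by an actual electromechanical potential model of the nanowire.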
Procedia PDF Downloads 308
5315 Autonomous Taxiing Robot for Grid Resilience Enhancement in Green Airport
Authors: Adedayo Ajayi, Patrick Luk, Liyun Lao
Abstract:
This paper studies the support needs of the electrical infrastructure of a green airport. In particular, the core objective revolves around the choice of the electric grid configuration required to meet the expected electrified loads, i.e., the taxiing and charging loads of hybrid/pure electric aircraft at the airport. Further, reliability and resilience are critical aspects of a newly proposed grid; the concept of mobile energy storage as energy as a service (EaaS) for grid support in the proposed green airport is investigated using an autonomous electric taxiing robot (A-ETR) in a case study (Cranfield Airport). The performance of the model is verified and validated through DIgSILENT PowerFactory simulation software to compare the networks in terms of power quality, short-circuit fault levels, system voltage profile, and power losses. Contingency and reliability index analyses are further carried out to show the potential of EaaS on the grid. The results demonstrate that the low-voltage a.c. network (LVAC) architecture gives better performance with adequate compensation than the low-voltage d.c. (LVDC) microgrid architecture for future green airport electrification, and that the A-ETR can deliver energy as a service to improve the airport's electrical power system resilience and energy supply.
Keywords: reliability, voltage profile, flightpath 2050, green airport
Procedia PDF Downloads 81
5314 Variability of Hydrological Modeling of the Blue Nile
Authors: Abeer Samy, Oliver C. Saavedra Valeriano, Abdelazim Negm
Abstract:
The Blue Nile Basin is the most important tributary of the Nile River. Egypt and Sudan are almost entirely dependent on water originating from the Blue Nile. This multi-dependency creates conflicts among the three countries (Egypt, Sudan, and Ethiopia), making the management of these conflicts an international issue. A good assessment of the water resources of the Blue Nile is important to help manage such conflicts, and hydrological models are a good tool for such an assessment. This paper presents a critical review of the nature and variability of the climate and hydrology of the Blue Nile Basin as a first step toward using hydrological modeling to assess the water resources of the Blue Nile. Several attempts have been made to develop basin-scale hydrological models of the Blue Nile. Lumped and semi-distributed models use averages of meteorological inputs and watershed characteristics in hydrological simulation to analyze runoff for flood control and water resource management. Distributed models include the temporal and spatial variability of catchment conditions and meteorological inputs to allow a better representation of the hydrological process. The main challenge for all of the models used to assess the water resources of the basin is the shortage of data needed for model calibration and validation. It is recommended to use distributed models, for their higher accuracy, to cope with the great variability and complexity of the Blue Nile Basin, and to collect sufficient data to support more sophisticated and accurate hydrological modeling.
Keywords: Blue Nile Basin, climate change, hydrological modeling, watershed
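As a point of reference for the lumped models mentioned above, the following is a minimal linear-reservoir rainfall-runoff sketch in Python. The storage constant, runoff coefficient, and rainfall series are illustrative placeholders, not calibrated Blue Nile values.

```python
import numpy as np

def linear_reservoir_runoff(rainfall_mm, k_days=15.0, runoff_coeff=0.3, dt=1.0):
    """Lumped linear-reservoir model: dS/dt = R_eff - Q, with Q = S / k.

    rainfall_mm : daily basin-average rainfall (mm)
    k_days      : storage constant (assumed, not a calibrated value)
    runoff_coeff: fraction of rainfall becoming effective runoff (assumed)
    """
    storage = 0.0
    discharge = np.zeros_like(rainfall_mm, dtype=float)
    for t, r in enumerate(rainfall_mm):
        storage += runoff_coeff * r * dt     # effective rainfall fills the reservoir
        q = storage / k_days                 # outflow proportional to storage
        storage -= q * dt
        discharge[t] = q
    return discharge

# Synthetic wet-season rainfall pulse (illustrative only)
rain = np.concatenate([np.zeros(30), 20 * np.ones(60), np.zeros(90)])
q = linear_reservoir_runoff(rain)
print("peak runoff (mm/day):", q.max().round(2))
```

A distributed model would apply a comparable water balance to each grid cell with spatially varying inputs, which is exactly where the data shortage discussed above becomes the limiting factor.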
Procedia PDF Downloads 366
5313 A Quantitative Survey Research on the Development and Assessment of Attitude toward Mathematics Instrument
Authors: Soofia Malik
Abstract:
The purpose of this study is to develop an instrument to measure undergraduate students’ attitudes toward mathematics (MAT) and to assess the data collected from the instrument for validity and reliability. The instrument is developed using five subscales: anxiety, enjoyment, self-confidence, value, and technology. The technology dimension is added as the fifth subscale of attitude toward mathematics because of the recent trend of incorporating online homework in mathematics courses as well as the heavy reliance of higher education on online learning management systems, such as Blackboard and Moodle. The sample consists of 163 (M = 82, F = 81) undergraduates enrolled in a College Algebra course in the summer 2017 semester at a university in the USA. The data are analyzed to answer the research question: if and how do undergraduate students’ attitudes toward mathematics load using Principal Components Analysis (PCA)? As a result of the PCA, three subscales emerged, namely: an anxiety/self-confidence scale, an enjoyment scale, and a value scale. After deleting the last five items, or the last two subscales, from the initial MAT scale, Cronbach’s alpha was recalculated using the scores from the remaining 20 items and was found to be α = .95. It is important to note that the reliability of the initial MAT form was α = .93. This means that employing the final MAT survey form would yield consistent results in repeated uses; the final MAT form is, therefore, more reliable than the initial MAT form.
Keywords: college algebra, Cronbach's alpha reliability coefficient, Principal Components Analysis, PCA, technology in mathematics
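For reference, the two computations named above (Cronbach's alpha and PCA loadings) can be sketched in a few lines of Python. The item responses below are random placeholders rather than the MAT data, so the printed numbers are only illustrative of the mechanics.

```python
import numpy as np
from sklearn.decomposition import PCA

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
# Placeholder data: 163 respondents x 25 Likert items (1-5), not the MAT responses
scores = rng.integers(1, 6, size=(163, 25)).astype(float)

print("Cronbach's alpha:", round(cronbach_alpha(scores), 3))

# Principal Components Analysis on standardized items
z = (scores - scores.mean(axis=0)) / scores.std(axis=0, ddof=1)
pca = PCA(n_components=3).fit(z)
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
print("loadings of first component:", pca.components_[0].round(2))
```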
Procedia PDF Downloads 123
5312 R Statistical Software Applied in Reliability Analysis: Case Study of Diesel Generator Fans
Authors: Jelena Vucicevic
Abstract:
Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety, and monetary costs. There are ways to calculate reliability, unreliability, failure density, and failure rate. This paper introduces another way of calculating reliability by using the R statistical software. R is a free software environment for statistical computing and graphics. It compiles and runs on a wide variety of UNIX platforms, Windows, and MacOS. The R programming environment is a widely used open-source system for statistical analysis and statistical programming. It includes thousands of functions for the implementation of both standard and new statistical methods, and it does not limit the user to operations related only to these functions. The program has many benefits over other similar programs: it is free and, as open source, constantly updated; it has a built-in help system; and the R language is easy to extend with user-written functions. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that there is no need for technical details, and it can be applied to any part for which we need to know the time to failure in order to plan appropriate maintenance, maximize usage, and minimize costs. In this case, the calculations have been made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with a higher-quality fan to prevent future failures. Seventy generators were studied. For each one, the number of hours of running time from its first being put into service until fan failure or until the end of the study (whichever came first) was recorded. The dataset consists of two variables: hours and status. Hours gives the running time of each fan, and status records the event: 1 = failed, 0 = censored. Censored data represent cases that could not be followed to the end of their life, so the fan could have either failed later or kept running. Obtaining the result using R was easy and quick; the program takes censored data into consideration and includes it in the results, which is not so easy in a hand calculation. For the purpose of the paper, results from the R program have been compared to hand calculations in two different cases: censored data treated as failures and censored data treated as successes. Across all three cases, the results are significantly different. If the user decides to use R for further calculations, it will give more precise results when working with censored data than a hand calculation.
Keywords: censored data, R statistical software, reliability analysis, time to failure
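The censoring logic described above can be sketched outside of R as well. The following is a minimal Python example under an assumed exponential failure model, with made-up hours/status values rather than the 70-fan dataset: with right censoring, the maximum-likelihood failure rate is the number of failures divided by the total running time, and treating censored units as failures visibly biases the estimate.

```python
import numpy as np

# Illustrative data (NOT the 70 diesel generator fans): running hours and
# status (1 = failed, 0 = censored, i.e. still running at the end of the study).
hours  = np.array([4500., 8100., 11500., 46000., 10100., 63000., 9800., 22000.])
status = np.array([1,     1,     0,      0,      1,      0,      1,     0])

# Exponential model with right censoring: lambda_hat = failures / total time on test
failures = status.sum()
total_time = hours.sum()
lam = failures / total_time

t = 8000.0  # hours
print(f"estimated failure rate: {lam:.2e} per hour")
print(f"R({t:.0f} h) = {np.exp(-lam * t):.3f}")

# Ignoring censoring (treating censored fans as failures) biases the rate upward:
lam_naive = len(hours) / total_time
print(f"naive failure rate ignoring censoring: {lam_naive:.2e} per hour")
```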
Procedia PDF Downloads 401
5311 Psychometric Properties of the Sensory Processing Measure Preschool-Home among Children with Autism in Saudi Arabia
Authors: Shahad Alkhalifah, Jonh Wright
Abstract:
Autism spectrum disorder (ASD) is a pervasive developmental disorder associated, for 42% to 88% of people with ASD, with sensory processing disorders. Sensory processing disorders (SPD) impact daily functioning, and it is, therefore, essential to be able to diagnose them accurately. Currently, however, there is no assessment tool available for the Saudi Arabian (SA) population that covers a wide enough age range. Therefore, this study aimed to assess the psychometric properties of the Sensory Processing Measure Preschool-Home Form (SPM-P) when used in English with a population of English-speaking Saudi participants; this approach was chosen due to time limitations and the urgency of providing practitioners with appropriate tools. Using a convenience sampling approach, a group of caregivers of typically developing (TD) children and a group of caregivers of children with ASD were recruited (N = 40 and N = 16, respectively) and completed the SPM-P Home Form. Participants were also invited to complete it again after two weeks for test-retest reliability, and nine and five, respectively, agreed. Reliability analyses suggested some issues with a few items when used in the Saudi culture and, along with interscale correlations, highlighted concerns with the factor structure. However, it was also found that the SPM-P Home has good criterion-based validity, and it is, therefore, suggested that it can be used until a tool is developed through translation and cultural adaptation. It is also suggested that the current factor structure of the SPM-P Home be reassessed using a larger sample.
Keywords: autism, sensory, assessment, reliability, sensory processing dysfunction, preschool, validity
Procedia PDF Downloads 230
5310 Drying Modeling of Banana Using Cellular Automata
Authors: M. Fathi, Z. Farhaninejad, M. Shahedi, M. Sadeghi
Abstract:
Drying is one of the oldest preservation methods for food and agricultural products. Appropriate control of the operation can be achieved by modeling. The limitations of continuous models for complex boundary conditions and irregular geometries have led to the appearance of novel discrete methods such as cellular automata (CA), which provide a platform for obtaining fast predictions through rule-based mathematics. In this research, a one-dimensional CA was used to simulate thin-layer drying of banana. Banana slices were dried with a convective air dryer, and experimental data were recorded for validating the final model. The model was programmed in MATLAB and run for 70,000 iterations with a von Neumann neighborhood. The validation results showed good agreement between the experimental and predicted data (R = 0.99). Cellular automata are capable of reproducing the expected drying pattern and have powerful potential for solving physical problems with reasonable accuracy and low computational resources.
Keywords: banana, cellular automata, drying, modeling
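As a flavor of how such a rule-based model works, the following is a minimal one-dimensional cellular-automaton drying sketch in Python: each cell exchanges moisture with its von Neumann (left/right) neighbors, and the exposed end cells lose moisture to the drying air. The rules and coefficients are illustrative, not the calibrated rules of the study.

```python
import numpy as np

n_cells, n_iter = 50, 70000
alpha = 0.2      # neighbor moisture-exchange coefficient (assumed)
beta = 1e-4      # surface evaporation coefficient (assumed)
m_air = 0.05     # equilibrium moisture ratio with the drying air (assumed)

m = np.ones(n_cells)          # normalized moisture content of each cell

for _ in range(n_iter):
    left, right = np.roll(m, 1), np.roll(m, -1)
    left[0], right[-1] = m[0], m[-1]             # no-flux interior boundaries
    m = m + alpha * (0.5 * (left + right) - m)   # exchange with von Neumann neighbors
    m[0]  -= beta * (m[0] - m_air)               # evaporation at the exposed surfaces
    m[-1] -= beta * (m[-1] - m_air)

print("mean moisture ratio after drying:", m.mean().round(3))
```

In practice, the exchange and evaporation coefficients would be fitted to drying-curve data, which is the role the convective-dryer experiments play in the study.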
Procedia PDF Downloads 438
5309 Generalized Additive Model Approach for the Chilean Hake Population in a Bio-Economic Context
Authors: Selin Guney, Andres Riquelme
Abstract:
The traditional bio-economic method for fisheries modeling uses estimates of the growth parameters and the system carrying capacity from a biological model of the population dynamics (usually a logistic population growth model), which is then analyzed as a traditional production function. The stock dynamic is transformed into a revenue function and then compared with the extraction costs to estimate the maximum economic yield. In this paper, the logistic growth model for the population is combined with a forecast of the abundance and location of the stock using a generalized additive model approach. The paper focuses on the Chilean hake population. This method allows for the incorporation of climatic variables and the interaction with other marine species, which in turn will increase the reliability of the estimates and generate better extraction paths for different conservation objectives, such as the maximum biological yield or the maximum economic yield.
Keywords: bio-economic, fisheries, GAM, production
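The bio-economic step described above can be sketched numerically. The snippet below uses the classical Schaefer (logistic) surplus-production relation to compute sustainable yield, revenue, and profit over a range of effort levels, and locates the maximum biological yield (MSY) and the maximum economic yield (MEY); the growth rate, carrying capacity, price, cost, and catchability values are hypothetical, not estimates for Chilean hake.

```python
import numpy as np

r, K = 0.4, 1.0e6            # intrinsic growth rate and carrying capacity (hypothetical)
q = 2.0e-3                   # catchability coefficient (hypothetical)
price, cost_per_E = 1.2, 800.0   # price per tonne and cost per unit effort (hypothetical)

E = np.linspace(0.0, r / q, 500)          # fishing effort levels
B = K * (1 - q * E / r)                   # equilibrium biomass under each effort
yield_sus = q * E * B                     # sustainable yield (Schaefer model)
profit = price * yield_sus - cost_per_E * E

i_msy, i_mey = yield_sus.argmax(), profit.argmax()
print(f"MSY = {yield_sus[i_msy]:.0f} at effort {E[i_msy]:.1f}  "
      f"(analytic check rK/4 = {r * K / 4:.0f})")
print(f"MEY effort = {E[i_mey]:.1f}, profit = {profit[i_mey]:.0f}")
```

A GAM-based abundance forecast would replace the fixed equilibrium biomass term with a prediction that responds to climatic variables and species interactions, which is the refinement the paper proposes.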
Procedia PDF Downloads 252
5308 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts
Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig
Abstract:
This study focuses on the evaluation of snow avalanche simulations, based on a survey carried out among avalanche experts. In the last decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety applications such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing certain quantities of the avalanche flow (e.g., pressure, velocities, flow heights, runout lengths) to be predicted. Because of the highly variable regimes of the flowing snow, no uniform rheological law describing the motion of an avalanche is known. Therefore, analogies to the fluid-dynamical laws of other materials are drawn. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there are high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations in an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility; hence, the model development is compelled to introduce further simplifications and the related uncertainties. In the light of these issues, many questions arise about avalanche simulations: their assets and drawbacks, potentials for improvement, and their application in practice. To address these questions, a survey among experts in the field of avalanche science (e.g., researchers, practitioners, engineers) from various countries has been conducted. In the questionnaire, special attention is paid to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results. Furthermore, it was tested to which degree a simulation result influences the decision making for a hazard assessment. A discrepancy could be found between the large uncertainty of the simulation input parameters and the relatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations. The credibility of the simulations is the result of a rather thorough simulation study, where different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations, and silent witnesses, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on modeling practice could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.
Keywords: expert interview, hazard management, modeling, simulation, snow avalanche
Procedia PDF Downloads 326
5307 Equivalent Circuit Model for the Eddy Current Damping with Frequency-Dependence
Authors: Zhiguo Shi, Cheng Ning Loong, Jiazeng Shan, Weichao Wu
Abstract:
This study proposes an equivalent circuit model to simulate the eddy current damping force, supported by shaking table tests and finite element modeling. The model is first proposed and applied to a simple eddy current damper modeled in ANSYS, indicating that the proposed model can simulate the eddy current damping force under different types of excitation. Then, a non-contact and friction-free eddy current damper is designed and tested, and the proposed model can reproduce the experimental observations. The excellent agreement between the simulated results and the experimental data validates the accuracy and reliability of the equivalent circuit model. Furthermore, a more complicated model is analyzed in ANSYS to verify the feasibility of the equivalent circuit model for a complex eddy current damper, and a higher-order fractional model and a viscous model are adopted for comparison.
Keywords: equivalent circuit model, eddy current damping, finite element model, shake table test
Procedia PDF Downloads 191
5306 Naphtha Catalytic Reform: Modeling and Simulation of Unity
Authors: Leal Leonardo, Pires Carlos Augusto de Moraes, Casiraghi Magela
Abstract:
In this work, the modeling and simulation of a catalytic reforming process were carried out in a comprehensive way, considering all the equipment that influences the operational performance. A semi-regenerative reformer was considered, with four reactors in series interleaved with four furnaces, two heat exchangers, one product separator, and one recycle compressor. A simplified reaction system was adopted, involving only ten chemical compounds related through five reactions. The process considered was applied to aromatics production (benzene, toluene, and xylene). The models developed for the different pieces of equipment were interconnected in a simulator consisting of a computer program written in FORTRAN 77. The simulation of the global model representing the reformer unit achieved results that are compatible with those in the literature. It was then possible to study the effects of the operational variables on the product concentrations and on the performance of the unit equipment.
Keywords: catalytic reforming, modeling, simulation, petrochemical engineering
Procedia PDF Downloads 515
5305 Preliminary Study of Human Reliability of Control in Case of Fire Based on the Decision Processes and Stress Model of Human in a Fire
Authors: Seung-Un Chae, Heung-Yul Kim, Sa-Kil Kim
Abstract:
This paper presents the findings of a preliminary study on human control performance in the case of fire. The relationship between human control and human decision making is studied through the decision processes and stress model of a human in a fire, covering the human behavior aspects involved in the decision process during a fire incident. The decision process comprises six individual perceptual processes: recognition, validation, definition, evaluation, commitment, and reassessment. A human may then be stressed while trying to reach an optimal decision for their activity. This paper explores problems in human control processes and stresses in a catastrophic situation; thus, the future approach will be concerned with reducing stress and ambiguous, irrelevant information.
Keywords: human reliability, decision processes, stress model, fire
Procedia PDF Downloads 986
5304 Probabilistic-Based Design of Bridges under Multiple Hazards: Floods and Earthquakes
Authors: Kuo-Wei Liao, Jessica Gitomarsono
Abstract:
Bridge reliability against natural hazards such as floods or earthquakes is an interdisciplinary problem that involves a wide range of knowledge. Moreover, due to global climate change, engineers have to design structures against multi-hazard threats. Currently, few practical design guidelines have included such a concept. Bridge foundations in Taiwan often do not have a uniform width; however, few studies have focused on the safety evaluation of a bridge with a complex pier, and investigation of the scouring depth in such a situation is very important. Thus, this study first focuses on investigating and improving the scour prediction formula for a bridge with a complicated foundation via experiments and artificial intelligence. Secondly, a probabilistic design procedure using the established prediction formula is proposed for practicing engineers under multi-hazard attacks.
Keywords: bridge, reliability, multi-hazards, scour
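To make the probabilistic design idea concrete, the following is a minimal Monte Carlo sketch in Python: uncertain flow conditions produce a scour demand through a generic power-law relation, which is compared against an uncertain foundation embedment. The distributions and coefficients are illustrative placeholders, not the improved formula or the calibrated statistics of the study.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 200_000

# Hypothetical random variables (NOT the study's calibrated distributions):
q = rng.lognormal(mean=np.log(12.0), sigma=0.35, size=n)   # unit discharge [m^2/s]
k = rng.normal(loc=1.0, scale=0.15, size=n)                # model uncertainty factor
embedment = rng.normal(loc=12.0, scale=1.0, size=n)        # available embedment depth [m]

# Generic power-law scour demand d_s = k * a * q^b (placeholder coefficients a, b)
a, b = 1.8, 0.6
scour_demand = k * a * q ** b

p_f = np.mean(scour_demand > embedment)
print(f"probability of scour exceeding embedment = {p_f:.4f}")
print(f"equivalent reliability index beta = {norm.ppf(1 - p_f):.2f}")
```

A multi-hazard extension would add the seismic demand as a further random variable and evaluate the limit state for the combined load cases.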
Procedia PDF Downloads 374
5303 Sensitivity and Reliability Analysis of Masonry Infilled Frames
Authors: Avadhoot Bhosale, Robin Davis P., Pradip Sarkar
Abstract:
The seismic performance of buildings with an irregular distribution of mass, stiffness, and strength along the height may be significantly different from that of regular buildings with masonry infill. Masonry-infilled reinforced concrete (RC) frames are a very common structural form used for multi-storey building construction. These structures are found to perform better in past earthquakes owing to the additional strength, stiffness, and energy dissipation provided by the infill walls. The seismic performance of a building depends on the variation of its material, structural, and geometrical properties, and the sensitivity of these properties affects the seismic response of the building. The main objective of the sensitivity analysis is to find the most sensitive parameter affecting the response of the building. This paper presents a sensitivity analysis, carried out using pushover analysis, that considers the 5% and 95% probability values of the random variables describing the infill characteristics, in order to obtain a reasonable range of results representing the wide number of possible situations that can be met in practice. The results show that variations in the strength-related properties of concrete and masonry, with the exception of the tensile strength of the concrete, have a significant effect on the structural performance, and that this effect increases as the damage condition of the concrete progresses. The seismic risk assessments of the selected frames are expressed in terms of a reliability index.
Keywords: fragility curve, sensitivity analysis, reliability index, RC frames
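A minimal sketch of this 5%/95% fractile screening is shown below, with a hypothetical capacity function standing in for the pushover model of the frame; the medians, coefficients of variation, and lognormal marginals are assumed values, not the study's frame data.

```python
import numpy as np
from scipy.stats import lognorm

def capacity(fc, fm, ft):
    """Hypothetical pushover base-shear capacity (kN) as a function of concrete
    strength fc, masonry strength fm, and concrete tensile strength ft.
    A placeholder response surface, not the study's frame model."""
    return 120.0 * fc ** 0.5 + 45.0 * fm ** 0.7 + 10.0 * ft

# Random variables: (median, coefficient of variation), assumed values
variables = {
    "concrete strength fc": (30.0, 0.15),
    "masonry strength fm":  (4.0,  0.25),
    "concrete tensile ft":  (2.5,  0.20),
}

base = capacity(*[m for m, _ in variables.values()])
print(f"capacity at median values: {base:.0f} kN")

# One-at-a-time sweep between the 5% and 95% fractiles of each variable
for name, (median, cov) in variables.items():
    s = np.sqrt(np.log(1 + cov ** 2))               # lognormal shape parameter
    lo = lognorm.ppf(0.05, s=s, scale=median)
    hi = lognorm.ppf(0.95, s=s, scale=median)
    args_lo = [lo if k == name else m for k, (m, _) in variables.items()]
    args_hi = [hi if k == name else m for k, (m, _) in variables.items()]
    swing = capacity(*args_hi) - capacity(*args_lo)
    print(f"{name:22s}: capacity swing = {swing:7.1f} kN")
```

Ranking the parameters by the resulting capacity swing identifies the most sensitive variable, which is the quantity the abstract refers to.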
Procedia PDF Downloads 323
5302 Design and Implementation of Reliable Location-Based Social Community Services
Authors: B. J. Kim, K. W. Nam, S. J. Lee
Abstract:
Traditional social network services provide users with more information than is needed, and it is not easy to verify the authenticity of that information. This paper proposes a system that allows users to post messages only at the place where they are located, in order to enhance the reliability of social networking services. The proposed system uses the Google Maps API to place postings on the map and to read postings within a range of distances from the user's location. The proposed system only provides alerts, memories, and information about locations within a given range, depending on the user's current location, thereby providing reliable information that users believe will be necessary in real time. It is expected that the proposed system will be able to meet the real demands of users and create a more reliable social networking environment.
Keywords: social network, location, reliability, posting
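The core range check described above can be sketched in a few lines; the snippet below uses the haversine great-circle distance to keep only postings within a given radius of the user's position. The coordinates, radius, and posting structure are hypothetical, not taken from the paper's implementation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def postings_in_range(user, postings, radius_km):
    """Return only the postings whose stored location lies within radius_km."""
    return [p for p in postings
            if haversine_km(user[0], user[1], p["lat"], p["lon"]) <= radius_km]

# Illustrative data (hypothetical coordinates and postings)
user_location = (37.5665, 126.9780)
postings = [
    {"id": 1, "lat": 37.5700, "lon": 126.9820, "text": "road closed ahead"},
    {"id": 2, "lat": 37.4500, "lon": 127.1000, "text": "festival this weekend"},
]
print(postings_in_range(user_location, postings, radius_km=2.0))
```

The same distance test applied at posting time is what restricts users to writing messages only at their current location.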
Procedia PDF Downloads 257
5301 Modeling Binomial Dependent Distribution of the Values: Synthesis Tables of Probabilities of Errors of the First and Second Kind of Biometrics-Neural Network Authentication System
Authors: B. S.Akhmetov, S. T. Akhmetova, D. N. Nadeyev, V. Yu. Yegorov, V. V. Smogoonov
Abstract:
The probabilities of errors of the first and second kind are estimated for non-ideal biometrics-neural transducers with 256 outputs, and nomograms of the 'own' and 'alien' error probabilities are constructed from the mathematical expectation and standard deviation of the normalized Hamming measures.
Keywords: modeling, errors, probability, biometrics, neural network, authentication
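A minimal sketch of the binomial model implied above: with 256 output bits, the Hamming distance to the stored template is modeled as binomial, and a decision threshold fixes the probability of an error of the first kind (rejecting 'own') and of the second kind (accepting 'alien'). The per-bit mismatch probabilities below are assumed, not taken from the paper's tables.

```python
from scipy.stats import binom

n_bits = 256
p_own   = 0.03   # per-bit mismatch probability for the genuine ('own') user (assumed)
p_alien = 0.45   # per-bit mismatch probability for an impostor ('alien') (assumed)

print(" thr | P(type I: reject own) | P(type II: accept alien)")
for threshold in (16, 32, 48, 64, 80):
    # Type I error: the genuine Hamming distance exceeds the threshold
    p1 = binom.sf(threshold, n_bits, p_own)
    # Type II error: the impostor Hamming distance falls at or below the threshold
    p2 = binom.cdf(threshold, n_bits, p_alien)
    print(f"{threshold:4d} | {p1:21.2e} | {p2:24.2e}")
```

Sweeping the threshold in this way reproduces the trade-off that the nomograms summarize as a function of the mean and spread of the Hamming measures.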
Procedia PDF Downloads 482
5300 Modelling Causal Effects from Complex Longitudinal Data via Point Effects of Treatments
Authors: Xiaoqin Wang, Li Yin
Abstract:
Background and purpose: In many practices, one estimates causal effects arising from a complex stochastic process, where a sequence of treatments is assigned to influence a certain outcome of interest and there exist time-dependent covariates between treatments. When covariates are plentiful and/or continuous, statistical modeling is needed to reduce the huge dimensionality of the problem and allow for the estimation of causal effects. Recently, Wang and Yin (Annals of Statistics, 2020) derived a new general formula, which expresses these causal effects in terms of the point effects of treatments in single-point causal inference. As a result, it is possible to conduct the modeling via point effects. The purpose of this work is to study the modeling of these causal effects via point effects. Challenges and solutions: The time-dependent covariates are often influenced by earlier treatments and in turn influence subsequent treatments. Consequently, the standard parameters, i.e., the means of the outcome given all treatments and covariates, are essentially all different (null paradox). Furthermore, the dimension of the parameters is huge (curse of dimensionality). Therefore, it can be difficult to conduct the modeling in terms of standard parameters. Instead of standard parameters, we have used the point effects of treatments to develop a likelihood-based parametric approach to the modeling of these causal effects, and we are able to model the causal effects of a sequence of treatments by modeling a small number of point effects of individual treatments. Achievements: We are able to conduct the modeling of the causal effects from a sequence of treatments in the familiar framework of single-point causal inference. The simulation shows that our method achieves not only an unbiased estimate of the causal effect but also the nominal level of type I error and a low level of type II error in hypothesis testing. We have applied this method to a longitudinal study of COVID-19 mortality among Scandinavian countries and found that the Swedish approach performed far worse than the other countries' approaches for COVID-19 mortality, and that the poor performance was largely due to its early measures during the initial period of the pandemic.
Keywords: causal effect, point effect, statistical modelling, sequential causal inference
Procedia PDF Downloads 205
5299 The Impact of Leadership Culture on Motivation, Efficiency, and Performance of Customs Employees: A Case Study of Iran Customs
Authors: Kazem Samadi
Abstract:
In today’s world, public agencies like customs have become vital institutions in international trade processes and in maintaining national economic security due to increasing economic and commercial complexities. In this regard, human resource management (HRM) is crucial to achieving organizational goals. This research employed a descriptive survey method in which the statistical population consisted of all customs employees. Using Cochran's formula, 300 employees were selected from the central customs office. A researcher-made questionnaire was used as the data collection tool, with content validity confirmed and reliability confirmed using Cronbach's alpha coefficient. The collected data were analyzed through structural modeling using SPSS and AMOS 24. The results indicated that leadership culture significantly affected employee motivation, efficiency, and performance in customs. Customs managers and leaders in Iran can improve organizational productivity by fostering this culture, thereby facilitating individual and organizational development for their staff.
Keywords: leadership culture, organizational culture, employee performance, customs
Procedia PDF Downloads 19
5298 Stochastic Richelieu River Flood Modeling and Comparison of Flood Propagation Models: WMS (1D) and SRH (2D)
Authors: Maryam Safrai, Tewfik Mahdi
Abstract:
This article presents the stochastic modeling of the Richelieu River flood in Quebec, Canada, which occurred in the spring of 2011. With the aid of the one-dimensional Watershed Modeling System (WMS v.10.1) and HEC-RAS (v.4.1) as a flood simulator, the delineation of the probabilistic flooded areas was considered. Based on the Monte Carlo method, WMS (v.10.1) delineated the probabilistic flooded areas with corresponding occurrence percentages. Furthermore, the results of this one-dimensional model were compared with the results of a two-dimensional model (SRH-2D) to evaluate the efficiency and precision of each applied model. Based on this comparison, the computational process of the two-dimensional model is longer and more complicated than that of the brief one-dimensional one. Although two-dimensional models are more accurate than the one-dimensional method, according to existing modelers the delineation of probabilistic flooded areas based on the Monte Carlo method is achievable via a one-dimensional modeler. The software applied in this case study responded well to the research objectives. As a result, flood risk maps of the Richelieu River produced with the two applied models (1D, 2D) could elucidate the flood risk factors in hydrological, hydraulic, and managerial terms.
Keywords: flood modeling, HEC-RAS, model comparison, Monte Carlo simulation, probabilistic flooded area, SRH-2D, WMS
Procedia PDF Downloads 140
5297 Primary School Students’ Modeling Processes: Crime Problem
Authors: Neslihan Sahin Celik, Ali Eraslan
Abstract:
As a result of the PISA (Programme for International Student Assessment) survey, which tests how well students can apply the knowledge and skills they have learned at school to real-life challenges, the new and redesigned mathematics education programs in many countries emphasize the necessity for students to face complex and multifaceted problem situations and to gain experience with them, allowing them to develop new skills and mathematical thinking that prepare them for their future life after school. At this point, mathematical models and modeling approaches can be utilized in the analysis of complex problems that represent real-life situations in which students can actively participate. In particular, model-eliciting activities that bring about situations allowing students to create solutions to problems and that involve mathematical modeling must be used right from the primary school years, allowing children to face such complex, real-life situations from an early age. A qualitative study was conducted in a university foundation primary school in the city center of a large province in the 2013-2014 academic year. The participants were fourth-grade students in a primary school. After a four-week preliminary study applied to a fourth-grade classroom, the three students included in the focus group were selected using a criterion sampling technique. The focus group of three students was videotaped as they worked on the Crime Problem. The conversation of the group was transcribed, examined together with the students' written work, and then analyzed through the lens of Blum and Ferri's modeling cycle. The results showed that primary fourth-grade students can successfully work with a model-eliciting problem, although they encounter some difficulties in the modeling process. In particular, they developed new ideas based on different assumptions, identified patterns among variables, and established a variety of models. On the other hand, they had trouble focusing on the problem and occasionally had breaks in the process.
Keywords: primary school, modeling, mathematical modeling, crime problem
Procedia PDF Downloads 404
5296 Reliability Assessment of Various Empirical Formulas for Prediction of Scour Hole Depth (Plunge Pool) Using a Comprehensive Physical Model
Authors: Majid Galoie, Khodadad Safavi, Abdolreza Karami Nejad, Reza Roshan
Abstract:
In this study, a comprehensive scouring model has been developed in order to evaluate the accuracy of various empirical relationships suggested for the prediction of scour hole depth in plunge pools by Martins, Mason, Chian, and Veronese. For this purpose, scour hole depths caused by free-falling jets from a flip bucket into a plunge pool were investigated, considering various discharges, angles, scouring times, etc. The final results demonstrated that all of the mentioned empirical formulas, except the Mason formula, were in reasonable agreement with the experimental data.
Keywords: scour hole depth, plunge pool, physical model, reliability assessment
Procedia PDF Downloads 535
5295 Evaluating the Accuracy of Biologically Relevant Variables Generated by ClimateAP
Authors: Jing Jiang, Wenhuan XU, Lei Zhang, Shiyi Zhang, Tongli Wang
Abstract:
Climate data quality significantly affects the reliability of ecological modeling. In the Asia Pacific (AP) region, low-quality climate data hinders ecological modeling. ClimateAP, software developed in 2017, generates high-quality climate data for the AP region, benefiting researchers in forestry and agriculture; however, its adoption remains limited. This study aims to confirm the validity of the biologically relevant variable data generated by ClimateAP during the normal climate period through comparison with the currently available gridded data. Climate data from 2,366 weather stations were used to evaluate the prediction accuracy of ClimateAP in comparison with the commonly used gridded data from WorldClim 1.4. Univariate regressions were applied to 48 monthly biologically relevant variables, and the relationship between the observational data and the predictions made by ClimateAP and WorldClim was evaluated using adjusted R-squared and Root Mean Squared Error (RMSE). Locations were categorized into mountainous and flat landforms, considering elevation, slope, ruggedness, and the Topographic Position Index. Univariate regressions were then applied to all biologically relevant variables for each landform category. Random Forest (RF) models were implemented for the climatic niche modeling of Cunninghamia lanceolata, and a comparative analysis of the prediction accuracies of RF models constructed with distinct climate data sources was conducted to evaluate their relative effectiveness. Biologically relevant variables were obtained from three unpublished Chinese meteorological datasets. ClimateAP v3.0 and WorldClim predictions were obtained from weather station coordinates and WorldClim 1.4 rasters, respectively, for the normal climate period of 1961-1990. Occurrence data for Cunninghamia lanceolata came from integrated biodiversity databases with 3,745 unique points. ClimateAP explains a minimum of 94.74%, 97.77%, 96.89%, and 94.40% of the monthly maximum, minimum, and average temperature and precipitation variances, respectively. It outperforms WorldClim on 37 biologically relevant variables with lower RMSE values. ClimateAP achieves higher R-squared values for the 12 monthly minimum temperature variables and consistently higher adjusted R-squared values across all landforms for precipitation. ClimateAP's temperature data yield lower adjusted R-squared values than the gridded data in high-elevation, rugged, and mountainous areas but achieve higher values in mid-slope drainages, plains, open slopes, and upper slopes. Using ClimateAP improves the prediction accuracy of tree occurrence from 77.90% to 82.77%. The biologically relevant climate data produced by ClimateAP are validated based on evaluations using observations from weather stations. The use of ClimateAP leads to an improvement in data quality, especially in non-mountainous regions. The results also suggest that using biologically relevant variables generated by ClimateAP can slightly enhance climatic niche modeling for tree species, offering a better understanding of tree species adaptation and resilience compared to using gridded data.
Keywords: climate data validation, data quality, Asia Pacific climate, climatic niche modeling, random forest models, tree species
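The station-level comparison described above boils down to fitting a univariate regression of the observations on each data source's predictions and summarizing it with adjusted R-squared and RMSE; a minimal sketch follows. The arrays are synthetic placeholders, not the 2,366-station dataset, so the numbers only illustrate the mechanics.

```python
import numpy as np

def adjusted_r2_and_rmse(obs, pred, n_predictors=1):
    """Fit obs ~ pred by ordinary least squares; return adjusted R^2 and RMSE."""
    slope, intercept = np.polyfit(pred, obs, 1)
    resid = obs - (slope * pred + intercept)
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    n = len(obs)
    r2 = 1 - ss_res / ss_tot
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - n_predictors - 1)
    rmse = np.sqrt(np.mean((obs - pred) ** 2))
    return adj_r2, rmse

rng = np.random.default_rng(3)
stations = 2366
truth = rng.normal(10, 8, stations)               # synthetic station temperatures
pred_a = truth + rng.normal(0, 1.0, stations)     # stands in for ClimateAP output
pred_b = truth + rng.normal(0, 1.8, stations)     # stands in for gridded output

for name, pred in [("source A", pred_a), ("source B", pred_b)]:
    adj_r2, rmse = adjusted_r2_and_rmse(truth, pred)
    print(f"{name}: adjusted R^2 = {adj_r2:.4f}, RMSE = {rmse:.2f}")
```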
Procedia PDF Downloads 68
5294 Data Modeling and Calibration of In-Line Pultrusion and Laser Ablation Machine Processes
Authors: David F. Nettleton, Christian Wasiak, Jonas Dorissen, David Gillen, Alexandr Tretyak, Elodie Bugnicourt, Alejandro Rosales
Abstract:
In this work, preliminary results are given for the modeling and calibration of two in-line processes, pultrusion and laser ablation, using machine learning techniques. The end product of the processes is the core of a medical guidewire, manufactured to comply with a user specification of diameter and flexibility. An ensemble approach is followed, which requires training several models. Two state-of-the-art machine learning algorithms are benchmarked: Kernel Recursive Least Squares (KRLS) and Support Vector Regression (SVR). The final objective is to build a precise digital model of the pultrusion and laser ablation processes in order to calibrate the resulting diameter and flexibility of the medical guidewire, which is the end product, while taking into account the friction on the forming die. The result is an ensemble of models whose output is within a strict required tolerance and which covers the required range of diameter and flexibility of the guidewire end product. The modeling and automatic calibration of complex in-line industrial processes is a key aspect of the Industry 4.0 movement for cyber-physical systems.
Keywords: calibration, data modeling, industrial processes, machine learning
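As a flavor of one member of such an ensemble, the following is a minimal Support Vector Regression sketch in Python (scikit-learn). The process settings, units, and target relationship are hypothetical stand-ins for the real pultrusion/ablation data, and KRLS is omitted because it has no standard scikit-learn implementation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

# Hypothetical process settings: [pull speed, die temperature, laser power]
X = rng.uniform([1.0, 150.0, 10.0], [5.0, 220.0, 40.0], size=(300, 3))
# Synthetic target: guidewire core diameter [mm] with a placeholder relationship
y = (0.30 - 0.01 * X[:, 0] + 0.0004 * (X[:, 1] - 180) - 0.002 * X[:, 2]
     + rng.normal(0, 0.002, 300))

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.001))
model.fit(X[:250], y[:250])           # train on the first 250 samples

pred = model.predict(X[250:])         # evaluate on the held-out samples
rmse = np.sqrt(np.mean((pred - y[250:]) ** 2))
print(f"hold-out RMSE on synthetic data: {rmse * 1000:.2f} micrometres")
```

Calibration then amounts to inverting such a fitted model: searching the process settings whose predicted diameter and flexibility fall inside the specified tolerance.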
Procedia PDF Downloads 298
5293 The Reliability Analysis of Concrete Chimneys Due to Random Vortex Shedding
Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta
Abstract:
Chimneys are generally tall and slender structures with circular cross-sections, which makes them highly prone to wind forces. Wind exerts pressure on the wall of a chimney, which produces unwanted forces, and vortex-induced oscillation is one such excitation that can lead to failure. Therefore, the vortex-induced oscillation of chimneys is of great concern to researchers and practitioners, since many failures of chimneys due to vortex shedding have occurred in the past. As a consequence, extensive research has taken place on the subject over decades. Many laboratory experiments have been performed to verify the theoretical models proposed to predict vortex-induced forces, including aero-elastic effects. Comparatively, very few prototype measurement data have been recorded to verify the proposed theoretical models. For this reason, the theoretical models developed with the help of experimental laboratory data are utilized for analyzing chimneys for vortex-induced forces. This calls for a reliability analysis of the predicted responses of chimneys produced by the vortex shedding phenomenon. Although a considerable literature exists on the vortex-induced oscillation of chimneys, including code provisions, the reliability analysis of chimneys against failure caused by vortex shedding is scanty. In the present study, the reliability analysis of chimneys against vortex shedding failure is presented, assuming the uncertainty in the vortex shedding phenomenon to be significantly greater than the other uncertainties, which are therefore ignored. The vortex shedding is modeled as a stationary random process and is represented by a power spectral density function (PSDF). It is assumed that the vortex shedding forces are perfectly correlated and act over the top one-third height of the chimney. The PSDF of the tip displacement of the chimney is obtained by performing a frequency-domain spectral analysis using a matrix approach. For this purpose, both the chimney and the random wind forces are discretized over a number of points along the height of the chimney. The method of analysis duly accounts for the aero-elastic effects. The double-barrier threshold crossing level, as proposed by Vanmarcke, is used for determining the probability of crossing different threshold levels of the tip displacement of the chimney. Assuming the annual distribution of the mean wind velocity to be a Gumbel type-I distribution, the fragility curve denoting the variation of the annual probability of threshold crossing against different threshold levels of the tip displacement of the chimney is determined. The reliability estimate is derived from the fragility curve. A 210 m tall concrete chimney with a base diameter of 35 m, a top diameter of 21 m, and a wall thickness of 0.3 m has been taken as an illustrative example. The terrain condition is assumed to be that corresponding to a city center. The expression for the PSDF of the vortex shedding force is taken from Vickery and Basu. The results of the study show that the threshold crossing reliability of the tip displacement of the chimney is significantly influenced by the assumed structural damping and the Gumbel distribution parameters. Further, the aero-elastic effect influences the reliability estimate to a great extent for small structural damping.
Keywords: chimney, fragility curve, reliability analysis, vortex-induced vibration
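A minimal sketch of the final step described above: a conditional threshold-crossing probability (given the mean wind speed) is combined with a Gumbel type-I distribution of the annual mean wind to obtain the annual probability of crossing a tip-displacement threshold. The conditional fragility function used here is a lognormal placeholder, not the Vanmarcke/spectral result of the study, and the Gumbel parameters are assumed values.

```python
import numpy as np
from scipy.stats import gumbel_r, norm

# Gumbel type-I parameters for the annual mean wind speed (assumed values)
loc, scale = 22.0, 3.0          # m/s

def p_cross_given_wind(v, threshold_m):
    """Placeholder conditional probability that the chimney tip displacement
    exceeds `threshold_m` in a year with mean wind speed v (m/s).
    A lognormal fragility in wind speed is assumed instead of the paper's
    spectral/Vanmarcke calculation."""
    median_capacity_wind = 30.0 + 15.0 * threshold_m   # m/s, hypothetical
    return norm.cdf(np.log(v / median_capacity_wind) / 0.25)

v = np.linspace(5.0, 60.0, 500)
pdf_v = gumbel_r.pdf(v, loc=loc, scale=scale)
dv = v[1] - v[0]

for threshold in (0.3, 0.5, 0.8):       # tip displacement thresholds [m]
    annual_p = np.sum(p_cross_given_wind(v, threshold) * pdf_v) * dv
    print(f"annual P(tip displacement > {threshold:.1f} m) = {annual_p:.2e}")
```

Plotting the annual crossing probability against the threshold level gives the fragility curve from which the reliability estimate is read.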
Procedia PDF Downloads 160
5292 Fault Ride Through Management in Renewable Power Park
Authors: Mohd Zamri Che Wanik
Abstract:
This paper presents the management of the fault ride through event within a solar farm during a grid fault. The modeling and simulation of a photovoltaic (PV) system with battery energy storage connected to the power network are described, together with the modeling approach and the study analysis performed. The model and operating scenarios are simulated using a digital simulator. The dynamic response of the system when subjected to a sudden, self-clearing temporary fault is presented. The capability of the PV system and battery storage to ride through the power system fault and, at the same time, support the local grid by injecting fault current is demonstrated. For each case, the different control methods used to achieve the objective of supporting the grid according to grid code requirements are presented and explained, and the inverter modeling approach is described.
Keywords: fault ride through, solar farm, grid code, power network
Procedia PDF Downloads 51
5291 Turbulence Modeling and Wave-Current Interactions
Authors: A. C. Bennis, F. Dumas, F. Ardhuin, B. Blanke
Abstract:
The mechanics of rip currents are complex, involving interactions between waves, currents, water levels, and the bathymetry, which present particular challenges for numerical models. Here, the effects of a grid-spacing-dependent horizontal mixing on the wave-current interactions are studied. Near the shore, wave rays diverge from channels towards bar crests because of refraction by topography and currents, in a way that depends on the rip current intensity, which is itself modulated by the horizontal mixing. At low resolution with the grid-spacing-dependent horizontal mixing, the wave motion is the same for both coupling modes because the deviation of the waves by the currents is weak. In the high-resolution case, however, classical results are found, with the stabilizing effect of the flow through the feedback of waves on currents. Lastly, wave-current interactions and the horizontal mixing strongly affect the intensity of the three-dimensional rip velocity.
Keywords: numerical modeling, wave-current interactions, turbulence modeling, rip currents
Procedia PDF Downloads 466
5290 A Stock Exchange Analysis in Turkish Logistics Sector: Modeling, Forecasting, and Comparison with Logistics Indices
Authors: Eti Mizrahi, Gizem İntepe
Abstract:
The geographical location of Turkey, stretching from Asia to Europe and from Russia to Africa, makes it an important logistics hub in the region. Although logistics is a developing sector in Turkey, its stock market representation is still low, with only two companies listed on Turkey’s stock exchange since 2010. In this paper, we use the daily values of these two listed stocks as a benchmark for the logistics sector. After modeling the logistics stock prices, an empirical examination is conducted between the existing logistics indices and these stock prices. The paper investigates whether the measures of logistics stocks are correlated with the newly available logistics indices, and it also shows how economic activity in the logistics sector is reflected on the stock exchange market. The results presented in this paper are the first analysis of the behavior of logistics indices and logistics stock prices for Turkey.
Keywords: forecasting, logistic stock exchange, modeling, Africa
Procedia PDF Downloads 541
5289 Development of Building Information Modeling for Cultural Heritage: The Case of West Theater in Gadara (Umm Qais), Jordan
Authors: Amal Alatar
Abstract:
Architectural legacy is considered a significant factor that has left its mark on the shape of buildings and of historical and archaeological sites all over the world. In this framework, this paper focuses on the town of Umm Qais, located in northern Jordan, which includes the archaeological remains of the ancient Decapolis city of Gadara, still a witness to the originality and architectural identity of the city. 3D modeling is a public asset and a valuable resource for cultural heritage; this technique makes it possible to produce accurate representations of objects, structures, and surfaces, and such representations become increasingly valuable assets when thinking about cultural heritage. Heritage Building Information Modeling (HBIM) is considered an effective tool to represent information on Cultural Heritage (CH) that can be used for documentation, restoration, conservation, presentation, and research purposes. Therefore, this paper focuses on the interdisciplinary project of the virtualization of the West Theater in Gadara (Umm Qais) for 3D documentation and structural studies. The derived 3D model of the cultural heritage is the basis for further archaeological studies; the challenges of the work lie in the acquisition, processing, and integration of the multi-resolution data as well as their interactive visualization.
Keywords: archaeology, 3D modeling, Umm Qais, culture heritage, Jordan
Procedia PDF Downloads 101