Search results for: quantitative modelling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5102

4712 The Difference of Learning Outcomes in Reading Comprehension between Text and Film as The Media in Indonesian Language for Foreign Speaker in Intermediate Level

Authors: Siti Ayu Ningsih

Abstract:

This study aims to identify differences in reading comprehension learning outcomes when text and film are used as media in Indonesian Language for Foreign Speakers (BIPA) learning at the intermediate level. Using both quantitative and qualitative research methods, the study draws on a single respondent from D'Royal Morocco Integrative Islamic School, in grade nine of the secondary level. The quantitative method is used to calculate the learning outcomes after the appropriate action cycle, whereas the qualitative method is used to interpret and describe the quantitative findings. The techniques used in this study are observation and testing. The research shows that text is a more effective medium than film for intermediate-level BIPA learners. When using film, the learner does not have enough time to note difficult vocabulary or to look up its meaning in a dictionary. Text, by contrast, proves more effective because it requires no additional time to note difficult words: for words that are difficult or unfamiliar, the learner can immediately find their meaning in the dictionary. The presence of the text also helps BIPA learners find the answers to questions more easily, by matching the vocabulary of the question to the text.

Keywords: Indonesian language for foreign speaker, learning outcome, media, reading comprehension

Procedia PDF Downloads 179
4711 Research on the Ecological Impact Evaluation Index System of Transportation Construction Projects

Authors: Yu Chen, Xiaoguang Yang, Lin Lin

Abstract:

Transportation construction is an important part of the infrastructure for economic and social development. During both construction and operation, the ability to evaluate a project's environmental impact correctly is crucial to the rational operation of existing transportation projects, the sound development of new transportation construction, and the adoption of appropriate, scientifically grounded environmental protection measures. Most existing research on ecological and environmental impact assessment is limited to individual aspects of the environment rather than the environmental system as a whole; its conclusions tend to be qualitative analyses at the technical and policy levels, lacking quantitative results and operable quantitative evaluation models. This paper presents a comprehensive analysis of the ecological and environmental impacts of transportation construction projects, taking into account factors such as the accessibility of data and the reliability of calculation results in order to extract indicators that reflect their essence and characteristics. Qualitative evaluation indicators were screened using the expert review method and measured using fuzzy statistics; quantitative indicators were screened using principal component analysis and measured through both literature search and calculation. An environmental impact evaluation index system comprising a general objective layer, a sub-objective layer, and an indicator layer was established, dividing the environmental impact of a transportation construction project into two periods: construction and operation.
On the basis of this evaluation index system, the indicator weights are determined using the hierarchical analysis method, and the individual indicators to be evaluated are made dimensionless, eliminating the influence of their original background and meaning. Finally, the paper applies these results to actual engineering practice to verify the correctness and operability of the evaluation method.
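The hierarchical analysis (AHP) weighting step mentioned in the abstract can be sketched as follows. This is a minimal illustration only: the three-indicator pairwise comparison matrix and its judgment values are hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three indicator-layer criteria.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# The principal eigenvector of A gives the AHP weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CR = (lambda_max - n) / (n - 1) / RI, with RI(3) = 0.58.
n = A.shape[0]
CR = (eigvals[k].real - n) / (n - 1) / 0.58
print(w, CR)  # weights sum to 1; CR < 0.1 indicates acceptable consistency
```

The same eigenvector computation scales to any number of criteria; only the random-index constant RI changes with the matrix size.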

Keywords: transportation construction projects, ecological and environmental impact, analysis and evaluation, indicator evaluation system

Procedia PDF Downloads 79
4710 Effects of Nano-Coating on the Mechanical Behavior of Nanoporous Metals

Authors: Yunus Onur Yildiz, Mesut Kirca

Abstract:

In this study, the mechanical properties of a nanoporous metal coated with a different metallic material are studied through a new atomistic modelling technique and molecular dynamics (MD) simulations. The modelling technique is based on the Voronoi tessellation method for the geometric representation of the ligaments. With the proposed technique, atomistic models of nanoporous metals with randomly oriented ligaments and non-uniform mass distribution along the ligament axis can be generated, while researchers retain control over both ligament length and diameter. Furthermore, the technique yields atomistic models of coated nanoporous materials for further mechanical or thermal characterization. The study consists of two stages. In the first stage, we use algorithms developed to generate the atomic coordinates of the coated nanoporous material. Coordinates of randomly distributed points are determined in a controlled way and employed to establish the Voronoi tessellation, which results in randomly oriented, intersecting line segments. The line segment representation of the tessellation is then transformed into an atomic structure by generating a non-uniform volumetric core region in which atoms are placed according to a specific crystal structure. As an extension, the technique coats nanoporous structures by creating a second volumetric region, encapsulating the core, in which the atoms of the coating material are generated. The goal of this stage is to produce atomic coordinates that can be employed in MD simulations of randomly organized coated nanoporous structures. In the second stage, the mechanical behaviour of the coated nanoporous models is investigated by examining deformation mechanisms through MD simulations.
In this way, the effect of coating on the mechanical behavior of the selected material couple is investigated.
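The geometry-generation stage can be illustrated with a small sketch: seed points are scattered in a box, a Voronoi tessellation is built, and the finite ridge edges serve as the ligament skeleton. This uses `scipy.spatial.Voronoi` as a stand-in for the authors' own algorithms, and the box size and point count are arbitrary assumptions.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
points = rng.uniform(0.0, 10.0, size=(40, 3))  # random seed points in a box
vor = Voronoi(points)

# Collect the finite edges bounding each Voronoi ridge; each edge would later
# be "thickened" into a core region of atoms, then wrapped by a coating shell.
edges = set()
for ridge in vor.ridge_vertices:
    for a, b in zip(ridge, ridge[1:] + ridge[:1]):
        if a != -1 and b != -1:        # skip edges reaching to infinity
            edges.add(tuple(sorted((a, b))))

lengths = [np.linalg.norm(vor.vertices[a] - vor.vertices[b]) for a, b in edges]
print(len(edges), float(np.mean(lengths)))
```

Controlling the seed-point distribution is what gives control over ligament length and diameter in the resulting skeleton.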

Keywords: atomistic modelling, molecular dynamics, nanoporous metals, Voronoi tessellation

Procedia PDF Downloads 264
4709 Comparative Analysis of the Computer Methods' Usage for Calculation of Hydrocarbon Reserves in the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Nowadays, the depletion of hydrocarbon deposits on land in the Kaliningrad region is driving active geological exploration and development of oil and natural gas reserves in the southeastern part of the Baltic Sea. LLC 'Lukoil-Kaliningradmorneft' is implementing a comprehensive program for the development of the region's shelf in 2014-2023. Owing to the heterogeneity of reservoir rocks in the various open fields, as well as ambiguous conclusions on the contours of deposits, additional geological prospecting and refinement of the recoverable oil reserves are being carried out. The key element is the use of an effective technique of computer reserve modelling at the first stage of data processing. The next step feeds this information into a cluster analysis, which makes it possible to optimize field development approaches. The article analyzes the effectiveness of various methods for calculating reserves and for computer modelling of offshore hydrocarbon fields. Cluster analysis makes it possible to measure the influence of the obtained data on the development of a technical and economic model for mining the deposits. A relationship is observed between the accuracy of the calculation of recoverable reserves and the need to modernize existing mining infrastructure, as well as to optimize the scheme for opening and developing oil deposits.
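A minimal sketch of the cluster-analysis step might look like the following, where deposits are grouped by hypothetical features (a reserve estimate and its relative uncertainty; both the features and the values are assumptions for illustration, not field data).

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means: assign points to nearest centre, recompute centres."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Hypothetical deposit features: [recoverable reserves (Mt), relative error]
X = np.array([[1.2, 0.30], [1.0, 0.28], [5.4, 0.10],
              [5.9, 0.12], [0.9, 0.33], [6.1, 0.09]])
labels, centers = kmeans(X, k=2)
print(labels)
```

Deposits landing in the same cluster (here, small reserves with high uncertainty versus large reserves with low uncertainty) could then share a common technical and economic development model.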

Keywords: cluster analysis, computer modelling of deposits, correction of the feasibility study, offshore hydrocarbon fields

Procedia PDF Downloads 143
4708 Spatial Organization of Organelles in Living Cells: Insights from Mathematical Modelling

Authors: Congping Lin

Abstract:

Intracellular transport in fungi has a number of important roles in, e.g., filamentous fungal growth and cellular metabolism. Two basic mechanisms for intracellular transport are motor-driven trafficking along microtubules (MTs) and diffusion. Mathematical modelling has been actively developed to understand such intracellular transport and provide unique insight into cellular complexity. Based on live-cell imaging data in Ustilago hyphal cells, probabilistic models have been developed to study mechanism underlying spatial organization of molecular motors and organelles. In particular, anther mechanism - stochastic motility of dynein motors along MTs has been found to contribute to half of its accumulation at hyphal tip in order to support early endosome (EE) recycling. The EE trafficking not only facilitates the directed motion of peroxisomes but also enhances their diffusive motion. Considering the importance of spatial organization of early endosomes in supporting peroxisome movement, computational and experimental approaches have been combined to a whole-cell level. Results from this interdisciplinary study promise insights into requirements for other membrane trafficking systems (e.g., in neurons), but also may inform future 'synthetic biology' studies.
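The interplay of the two transport mechanisms can be caricatured with a stochastic simulation in which each cargo alternates between motor-bound directed runs and free diffusion. All parameter values below are illustrative assumptions, not the paper's fitted rates.

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps, dt = 500, 1000, 0.01   # cargoes, time steps, step size (s)
D = 0.05                          # diffusion coefficient (um^2/s), assumed
v = 1.0                           # motor run speed along the MT (um/s), assumed
p_bound = 0.5                     # fraction of time a cargo is motor-bound

x = np.zeros(n)                   # positions along the hypha (um)
for _ in range(steps):
    bound = rng.random(n) < p_bound           # stochastically motor-bound?
    drift = np.where(bound, v * dt, 0.0)      # directed run toward the tip
    x += drift + np.sqrt(2 * D * dt) * rng.standard_normal(n)

# Mean displacement is roughly p_bound * v * T; diffusion spreads the cloud.
print(float(x.mean()), float(x.std()))
```

The directed (motor) component shifts the whole population toward the tip, while the diffusive component broadens it, mirroring how EE trafficking both transports and disperses peroxisomes.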

Keywords: intracellular transport, stochastic process, molecular motors, spatial organization

Procedia PDF Downloads 108
4707 A Comparison of Tsunami Impact to Sydney Harbour, Australia at Different Tidal Stages

Authors: Olivia A. Wilson, Hannah E. Power, Murray Kendall

Abstract:

Sydney Harbour is an iconic location with a dense population and low-lying development. On the east coast of Australia, facing the Pacific Ocean, it is exposed to several tsunamigenic trenches. This paper presents a component of the most detailed assessment of the potential for earthquake-generated tsunami impact on Sydney Harbour to date. Models in this study use dynamic tides to account for tide-tsunami interaction. Sydney Harbour’s tidal range is 1.5 m, and the spring tides from January 2015 that are used in the modelling for this study are close to the full tidal range. The tsunami wave trains modelled include hypothetical tsunami generated from earthquakes of magnitude 7.5, 8.0, 8.5, and 9.0 MW from the Puysegur and New Hebrides trenches as well as representations of the historical 1960 Chilean and 2011 Tohoku events. All wave trains are modelled for the peak wave to coincide with both a low tide and a high tide. A single wave train, representing a 9.0 MW earthquake at the Puysegur trench, is modelled for peak waves to coincide with every hour across a 12-hour tidal phase. Using the hydrodynamic model ANUGA, results are compared according to the impact parameters of inundation area, depth variation and current speeds. Results show that both maximum inundation area and depth variation are tide dependent. Maximum inundation area increases when coincident with a higher tide, however, hazardous inundation is only observed for the larger waves modelled: NH90high and P90high. The maximum and minimum depths are deeper on higher tides and shallower on lower tides. The difference between maximum and minimum depths varies across different tidal phases although the differences are slight. Maximum current speeds are shown to be a significant hazard for Sydney Harbour; however, they do not show consistent patterns according to tide-tsunami phasing. 
The maximum current speed hazard is shown to be greater in specific locations such as Spit Bridge, a narrow channel with extensive marine infrastructure. The results presented for Sydney Harbour are novel, and the conclusions are consistent with previous modelling efforts in the greater area. It is shown that tide must be a consideration for both tsunami modelling and emergency management planning. Modelling with peak tsunami waves coinciding with a high tide would be a conservative approach; however, it must be considered that maximum current speeds may be higher on other tides.

Keywords: emergency management, Sydney, tide-tsunami interaction, tsunami impact

Procedia PDF Downloads 219
4706 Graphics Processing Unit-Based Parallel Processing for Inverse Computation of Full-Field Material Properties Based on Quantitative Laser Ultrasound Visualization

Authors: Sheng-Po Tseng, Che-Hua Yang

Abstract:

Motivation and Objective: Ultrasonic guided waves have become an important tool for the nondestructive evaluation of structures and components, used to identify defects or evaluate material properties. When guided waves are applied to evaluate material properties, the properties are not obtained directly: preliminary signals such as time-domain signals or frequency-domain spectra are acquired first, and inversion calculations are then applied to the measured ultrasound data to obtain the desired mechanical properties. Methods: This research develops a high-speed inversion technique for obtaining full-field mechanical properties from the quantitative laser ultrasound visualization system (QLUVS). The QLUVS employs a mirror-controlled scanning pulsed laser to generate guided acoustic waves traveling in a two-dimensional target; the waves are detected with a piezoelectric transducer at a fixed location. With gyro-scanning of the generation source, the QLUVS offers fast, full-field, quantitative inspection. Results and Discussions: This research introduces two tools to improve computation efficiency. First, graphics processing units (GPUs) with large numbers of cores are employed. Second, combining CPU and GPU cores, a parallel processing scheme is developed for the inversion of full-field mechanical properties from QLUVS data. The new inversion scheme is applied to single-layered and double-layered plate-like samples, and the computation is shown to be 80 times faster than the unparallelized scheme. Conclusions: This research demonstrates a high-speed inversion technique for the characterization of full-field material properties based on the quantitative laser ultrasound visualization system.
Significant computation efficiency is achieved, though the limit has not yet been reached; further gains are possible by improving the parallel computation. With this full-field inspection technology, full-field mechanical properties can be measured nondestructively, at high speed and with high precision, yielding both qualitative and quantitative results. The developed high-speed computation scheme is ready for applications where full-field mechanical properties are needed in a nondestructive and near real-time way.
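The data-parallel pattern behind such a GPU inversion can be sketched with a toy example: every pixel of the field is matched against every candidate modulus at once. The forward model below (rod-wave speed c = sqrt(E/rho)) and all numbers are assumptions for illustration; the paper's guided-wave dispersion model is more involved.

```python
import numpy as np

# Toy forward model: longitudinal wave speed c = sqrt(E / rho).
rho = 2700.0                                  # density (kg/m^3), assumed
E_true = np.full((64, 64), 70e9)              # "full-field" modulus map (Pa)
E_true[16:48, 16:48] = 110e9                  # embedded stiffer region
c_meas = np.sqrt(E_true / rho)                # synthetic measured wave speeds

# Inversion as a parallel grid search: all pixels x all candidates at once,
# the same data-parallel pattern a GPU kernel would execute per thread.
E_grid = np.linspace(50e9, 130e9, 801)
err = np.abs(c_meas[..., None] - np.sqrt(E_grid / rho))   # shape (64, 64, 801)
E_inv = E_grid[np.argmin(err, axis=-1)]

print(float(np.max(np.abs(E_inv - E_true))))  # error bounded by grid spacing
```

On a GPU each pixel's search would run in its own thread; the vectorized NumPy expression is the CPU analogue of that kernel.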

Keywords: guided waves, material characterization, nondestructive evaluation, parallel processing

Procedia PDF Downloads 179
4705 A Conceptual E-Business Model and the Effect of Strategic Planning Parameters on E-Business Strategy Management and Performance

Authors: Alexandra Lipitakis, Evangelia A. E. C. Lipitakis

Abstract:

In this article, a class of e-business strategy planning parameters is introduced, and their effect on the financial and non-financial performance of e-businesses and organizations is investigated. The relationships between these strategic planning parameters, i.e. formality, participation, sophistication, thoroughness, synergy and cooperation, entropic factor, adaptivity, and uncertainty, and financial and non-financial performance are examined, and the directions of these relationships are given. A conceptual model has been constructed, and quantitative research methods can be used to test the eight hypotheses considered. Within the framework of e-business strategy planning, this research study demonstrates how strategic planning components relate positively to e-business strategy management and performance.

Keywords: e-business management, e-business model, e-business performance assessments, strategy management methodologies, strategy planning, quantitative methods

Procedia PDF Downloads 364
4704 Application of Human Biomonitoring and Physiologically-Based Pharmacokinetic Modelling to Quantify Exposure to Selected Toxic Elements in Soil

Authors: Eric Dede, Marcus Tindall, John W. Cherrie, Steve Hankin, Christopher Collins

Abstract:

Current exposure models used in contaminated land risk assessment are highly conservative. Their use may lead to over-estimation of actual exposures, possibly resulting in negative financial implications due to unnecessary remediation. We are therefore carrying out a study seeking to improve our understanding of human exposure to selected toxic elements in soil, namely arsenic (As), cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb), resulting from allotment land-use. The study employs biomonitoring and physiologically-based pharmacokinetic (PBPK) modelling to quantify human exposure to these elements. We recruited 37 allotment users (adults > 18 years old) in Scotland, UK, to participate in the study. Concentrations of the elements (and their bioaccessibility) were measured in allotment samples (soil and allotment produce). Records of the produce consumed by the participants and their biological samples (urine and blood) were collected for up to 12 consecutive months. Ethical approval was granted by the University of Reading Research Ethics Committee. PBPK models (coded in MATLAB) were used to estimate the distribution and accumulation of the elements in key body compartments, thus indicating the internal body burden. Simulating low element intake (based on 'doses' estimated from produce consumption records), the predictive models suggested that detection of these elements in urine and blood was possible within a given period following exposure. This information was used in planning the biomonitoring and is currently being used in interpreting the test results from biological samples. The models are being evaluated against the biomonitoring data by comparing model-predicted concentrations with measured biomarker concentrations. The PBPK models will be used to generate bioavailability values, which could be incorporated into contaminated land exposure models.
Thus, the findings from this study will promote a more sustainable approach to contaminated land management.
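The flavour of a PBPK calculation can be conveyed with a deliberately minimal two-compartment sketch (gut to blood to urine), integrated with forward Euler. The rate constants and dose are illustrative assumptions, not the study's MATLAB model or its fitted values.

```python
# Minimal two-compartment sketch: gut -> blood -> urine, forward Euler.
# Rate constants are assumed for illustration, NOT the study's fitted values.
k_abs, k_elim = 0.5, 0.1          # absorption / elimination rates (1/day)
dose = 1.0                        # unit oral intake of the element
dt, days = 0.01, 60.0

gut, blood, urine = dose, 0.0, 0.0
t = 0.0
while t < days:
    a = k_abs * gut * dt          # mass absorbed from gut into blood
    e = k_elim * blood * dt       # mass eliminated from blood into urine
    gut, blood, urine = gut - a, blood + a - e, urine + e
    t += dt

# Mass balance holds: gut + blood + urine stays equal to the dose.
print(blood, urine)
```

A real PBPK model adds organ compartments, blood-flow-limited exchange, and element-specific kinetics, but the same mass-balance bookkeeping underlies the prediction of biomarker concentrations in urine and blood.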

Keywords: biomonitoring, exposure, PBPK modelling, toxic elements

Procedia PDF Downloads 299
4703 Quantitative Analysis of the Trade Potential of the United States with Members of the European Union: A Gravity Model Approach

Authors: Zahid Ahmad, Nauman Ali

Abstract:

This study estimates trade between the USA and individual members of the European Union using the gravity model of trade. The USA has a complex trade relationship with the European countries: the EU comprises a large number of consumers and accounts for a major share of the USA's total world trade. However, the trade potential of the USA with individual EU members is not known. Panel data techniques, e.g. random effects, fixed effects, and pooled panel, have been applied to secondary quantitative data to analyze trade between the USA and the EU. The trade potential of the USA with each EU member has been obtained as the ratio of the actual trade of the USA with that member to the trade predicted by the gravity model. The study concludes that the USA has untapped trade potential with 16 members of the EU, with Croatia, Portugal, and the United Kingdom at the top. On the other hand, Finland, Ireland, and France are the top countries with which the USA has exhausted its trade potential.
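The gravity model estimation and the potential ratio can be sketched on synthetic data: trade is modelled as T = A * GDPi^b1 * GDPj^b2 / D^b3 and estimated log-linearly by least squares (a pooled-OLS stand-in for the panel estimators; all data below are simulated, not the study's).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
gdp_i = rng.uniform(1, 10, n)     # exporter GDP (synthetic)
gdp_j = rng.uniform(1, 10, n)     # importer GDP (synthetic)
dist = rng.uniform(1, 20, n)      # bilateral distance (synthetic)

# Simulate trade from a known gravity relationship plus noise.
trade = 2.0 * gdp_i**1.0 * gdp_j**0.8 / dist**1.1 \
        * np.exp(0.05 * rng.standard_normal(n))

# Log-linear OLS: log T = b0 + b1 log GDPi + b2 log GDPj + b3 log D.
X = np.column_stack([np.ones(n), np.log(gdp_i), np.log(gdp_j), np.log(dist)])
beta, *_ = np.linalg.lstsq(X, np.log(trade), rcond=None)

predicted = np.exp(X @ beta)
potential = trade / predicted     # ratio < 1: untapped potential remains
print(beta)
```

The actual-to-predicted ratio computed in the last step is exactly the trade-potential measure the abstract describes: values below one flag partners with room to grow, values around or above one flag exhausted potential.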

Keywords: analytical technique, economic, gravity, international trade, significant

Procedia PDF Downloads 282
4702 Application of Data Driven Based Models as Early Warning Tools of High Stream Flow Events and Floods

Authors: Mohammed Seyam, Faridah Othman, Ahmed El-Shafie

Abstract:

The early warning of high stream flow events (HSF) and floods is an important aspect of the management of surface water and river systems. It can be performed using either process-based models or data-driven models such as artificial intelligence (AI) techniques. The main goal of this study is to develop an efficient AI-based model for predicting real-time hourly stream flow (Q) and to apply it as an early warning tool for HSF and floods in the downstream area of the Selangor River basin, taken here as a paradigm of humid tropical rivers in Southeast Asia. The performance of the AI-based models has been improved through the integration of lag time (Lt) estimation into the modelling process. A total of 8753 hourly records of Q, water level, and rainfall (RF), representing a one-year period (2011), were utilized in the modelling process. Six hydrological scenarios were arranged as hypothetical cases of the input variables to investigate how changes in RF intensity at upstream stations can lead to the formation of floods; the initial stream flow was changed for each scenario in order to cover a wide range of hydrological situations. The performance evaluation of the developed AI-based model shows a high correlation coefficient (R) between the observed and predicted Q. The model has been successfully employed for early warning through the advance detection of hydrological conditions that could lead to HSF and floods, represented by three levels of severity (i.e., alert, warning, and danger). Based on the scenario results, reaching the danger level in the downstream area requires high RF intensity in at least two upstream areas. It can be concluded that AI-based models are beneficial tools for local authorities for flood control and awareness.
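The core idea of predicting downstream flow from lagged upstream rainfall and grading the prediction into warning levels can be sketched as follows. The lag, thresholds, and data are all illustrative assumptions (a linear regression stands in for the AI model).

```python
import numpy as np

rng = np.random.default_rng(3)
T = 2000
rf = rng.gamma(0.3, 2.0, T)                    # synthetic hourly upstream RF

# Synthetic stream flow responding to rainfall with a 3-hour lag time (Lt).
lt = 3
q = 5.0 + 0.8 * np.concatenate([np.zeros(lt), rf[:-lt]]) \
    + 0.1 * rng.standard_normal(T)

# Data-driven model: predict Q(t) from RF(t - Lt), mimicking Lt integration.
X = np.column_stack([np.ones(T - lt), rf[:-lt]])
beta, *_ = np.linalg.lstsq(X, q[lt:], rcond=None)
q_hat = X @ beta

# Early-warning severity levels on predicted flow (thresholds are assumed).
alert, warning, danger = 6.0, 8.0, 10.0
levels = np.digitize(q_hat, [alert, warning, danger])   # 0..3
print(beta, int((levels == 3).sum()))
```

Because the prediction uses rainfall observed Lt hours earlier, the severity level is available Lt hours before the flow actually arrives downstream, which is what makes the scheme usable as an early warning tool.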

Keywords: floods, stream flow, hydrological modelling, hydrology, artificial intelligence

Procedia PDF Downloads 225
4701 Optimisation of Structural Design by Integrating Genetic Algorithms in the Building Information Modelling Environment

Authors: Tofigh Hamidavi, Sepehr Abrishami, Pasquale Ponterosso, David Begg

Abstract:

Structural design and analysis is an important and time-consuming process, particularly at the conceptual design stage. Decisions made at this stage can have an enormous effect on the entire project, as it becomes ever costlier and more difficult to alter the choices made early on in the construction process. Hence, optimisation of the early stages of structural design can provide important efficiencies in terms of cost and time. This paper suggests a structural design optimisation (SDO) framework in which Genetic Algorithms (GAs) may be used to semi-automate the production and optimisation of early structural design alternatives. This framework has the potential to leverage conceptual structural design innovation in Architecture, Engineering and Construction (AEC) projects. Moreover, this framework improves the collaboration between the architectural stage and the structural stage. It will be shown that this SDO framework can make this achievable by generating the structural model based on the extracted data from the architectural model. At the moment, the proposed SDO framework is in the process of validation, involving the distribution of an online questionnaire among structural engineers in the UK.
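The GA loop at the heart of such an SDO framework can be sketched with a deliberately small example: sizing a rectangular beam section for minimum area under a bending-stress constraint. The load, allowable stress, and GA settings are all assumptions for illustration, not the framework's actual formulation.

```python
import random

random.seed(4)

# Size a rectangular beam (b, h in mm) to carry a bending moment at minimum
# cross-sectional area. Load and stress limit are illustrative assumptions.
M, sigma_allow = 10e6, 160.0            # N*mm, N/mm^2

def fitness(ind):
    b, h = ind
    sigma = 6 * M / (b * h * h)         # bending stress for a rectangle
    penalty = 1e6 if sigma > sigma_allow else 0.0
    return b * h + penalty              # minimise area + constraint penalty

def mutate(ind):
    return [min(600.0, max(10.0, g + random.gauss(0, 5))) for g in ind]

pop = [[random.uniform(50, 300), random.uniform(50, 600)] for _ in range(40)]
for gen in range(100):
    pop.sort(key=fitness)
    parents = pop[:10]                                  # selection
    children = []
    while len(children) < 30:
        a, b_ = random.sample(parents, 2)
        child = [random.choice(g) for g in zip(a, b_)]  # uniform crossover
        children.append(mutate(child))                  # mutation
    pop = parents + children

best = min(pop, key=fitness)
print(best, fitness(best))
```

In the BIM setting, the chromosome would encode structural alternatives derived from the architectural model rather than two section dimensions, but the selection-crossover-mutation cycle is the same.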

Keywords: building information modelling, BIM, genetic algorithm, GA, architecture-engineering-construction, AEC, optimisation, structure, design, population, generation, selection, mutation, crossover, offspring

Procedia PDF Downloads 212
4700 Geospatial Analysis of Hydrological Response to Forest Fires in Small Mediterranean Catchments

Authors: Bojana Horvat, Barbara Karleusa, Goran Volf, Nevenka Ozanic, Ivica Kisic

Abstract:

Forest fires are a major threat in many regions of Croatia, especially in coastal areas. Although they can be caused by natural processes, the most common cause is the human factor, intentional or unintentional. Forest fires drastically transform landscapes and influence natural processes. The main goal of the presented research is to analyse and quantify the impact of forest fire on hydrological processes and to propose the model that best describes the changes in hydrological patterns in the analysed catchments. Keeping in mind the spatial component of these processes, geospatial analysis is performed to gain better insight into the spatial variability of the hydrological response to disastrous events. To that end, two catchments that experienced severe forest fires were delineated, and various hydrological and meteorological data, both attribute and spatial, were collected. The major drawback is certainly the lack of hydrological data, common in small torrential karstic streams; hence, modelling results should be validated against data collected in a catchment that has similar characteristics and established hydrological monitoring. The event chosen for the modelling is the forest fire that occurred in July 2019 and burned nearly 10% of the analysed area. Surface (land use/land cover) conditions before and after the event were derived from two Sentinel-2 images, and the burnt area was mapped by comparing the Normalized Burn Ratio (NBR) computed from each image. To estimate and compare hydrological behaviour before and after the event, curve number (CN) values were assigned to the land use/land cover classes derived from the satellite images. The hydrological modelling produced surface runoff estimates and hence predictions of the catchments' hydrological responses to a forest fire event. The research was supported by the Croatian Science Foundation through the project 'Influence of Open Fires on Water and Soil Quality' (IP-2018-01-1645).
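The burnt-area mapping step reduces to computing NBR = (NIR - SWIR) / (NIR + SWIR) for both images and thresholding the difference (dNBR). The sketch below uses synthetic reflectance arrays and a commonly cited dNBR threshold; the band choices and threshold are assumptions, not the paper's calibration.

```python
import numpy as np

# NBR = (NIR - SWIR) / (NIR + SWIR); for Sentinel-2, NIR ~ B8, SWIR ~ B12.
def nbr(nir, swir):
    return (nir - swir) / (nir + swir + 1e-9)

rng = np.random.default_rng(5)
nir_pre = rng.uniform(0.3, 0.5, (50, 50))       # synthetic pre-fire scene
swir_pre = rng.uniform(0.1, 0.2, (50, 50))
nir_post, swir_post = nir_pre.copy(), swir_pre.copy()
nir_post[:10], swir_post[:10] = 0.12, 0.35      # burnt strip: NIR drops, SWIR rises

dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
burnt = dnbr > 0.27                             # a commonly used dNBR threshold
print(float(burnt.mean()))                      # fraction of the scene burnt
```

The resulting burnt mask is what then drives the reassignment of land use/land cover classes and their curve numbers in the post-fire runoff model.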

Keywords: Croatia, forest fire, geospatial analysis, hydrological response

Procedia PDF Downloads 104
4699 Simulation of Scaled Model of Tall Multistory Structure: Raft Foundation for Experimental and Numerical Dynamic Studies

Authors: Omar Qaftan

Abstract:

Earthquakes can cause tremendous loss of human life and severe damage to many civil engineering structures, especially tall buildings. The response of a multistory structure subjected to earthquake loading is complex and needs to be studied by physical and numerical modelling. In many circumstances, scale models on a shaking table can be a more economical option than equivalent full-scale tests. A shaking table is a powerful apparatus that offers the possibility of understanding the actual behaviour of structural systems under earthquake loading, but a set of scaling relations is required to predict the behaviour of the full-scale structure, and selecting the scale factors is the most important step in simulating the prototype with a scaled model. In this paper, the principles of the scale modelling procedure are explained in detail, and the simulation of a scaled multi-storey concrete structure for dynamic studies is investigated. A complete dynamic simulation analysis is carried out experimentally and numerically with a scale factor of 1/50. The frequency-domain response and lateral displacement of both the numerical and experimental scaled models are determined. The procedure accounts for the actual dynamic behaviour of both the full-size prototype structure and the scaled model, and it is adapted here to determine the effects of a tall multi-storey structure on a raft foundation. Four generated accelerograms complying with EC8 were used as inputs for the time history motions. The experimental results, expressed in terms of displacements and accelerations, are compared with those obtained from a conventional fixed-base numerical model. The four time histories were applied in both the experimental and numerical models, and the experimental outputs show acceptable accuracy compared with the numerical model outputs.
This modelling methodology is therefore valid and qualified for other shaking table experiments.
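One common way to derive the remaining scale factors from the chosen 1/50 length scale is to preserve acceleration (as in artificial-mass similitude) and propagate the factors dimensionally. The sketch below shows that bookkeeping; it is a generic similitude scheme, not necessarily the exact one adopted in the paper.

```python
import math

# Similitude factors for a 1/50 shaking-table model with acceleration
# preserved (S_a = 1). Derived dimensionally: a = L/t^2, v = a*t, d = L.
S_L = 1 / 50                  # length scale (model / prototype)
S_a = 1.0                     # acceleration scale (preserved)
S_t = math.sqrt(S_L / S_a)    # time scale, from S_a = S_L / S_t^2
S_f = 1 / S_t                 # frequency scale
S_v = S_a * S_t               # velocity scale
S_d = S_L                     # displacement scale

print(f"time x{S_t:.4f}, frequency x{S_f:.2f}, displacement x{S_d}")
```

With these factors, the input accelerograms keep their amplitudes but are compressed in time by a factor of about 7.07, and measured model displacements are multiplied by 50 to recover prototype values.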

Keywords: structure, raft, soil, interaction

Procedia PDF Downloads 110
4698 Hidden Markov Movement Modelling with Irregular Data

Authors: Victoria Goodall, Paul Fatti, Norman Owen-Smith

Abstract:

Hidden Markov Models have become popular for the analysis of animal tracking data. These models are being used to model the movements of a variety of species in many areas around the world. A common assumption of the model is that the observations need to have regular time steps. In many ecological studies, this will not be the case. The objective of the research is to modify the movement model to allow for irregularly spaced locations and investigate the effect on the inferences which can be made about the latent states. A modification of the likelihood function to allow for these irregular spaced locations is investigated, without using interpolation or averaging the movement rate. The suitability of the modification is investigated using GPS tracking data for lion (Panthera leo) in South Africa, with many observations obtained during the night, and few observations during the day. Many nocturnal predator tracking studies are set up in this way, to obtain many locations at night when the animal is most active and is difficult to observe. Few observations are obtained during the day, when the animal is expected to rest and is potentially easier to observe. Modifying the likelihood function allows the popular Hidden Markov Model framework to be used to model these irregular spaced locations, making use of all the observed data.
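One way to realize the modified likelihood described above is to let the transition probabilities depend on the observation gap through a continuous-time generator matrix G, so that P(dt) = expm(G * dt). The sketch below evaluates such a likelihood with the scaled forward algorithm; the rates, step-length densities, and observation sequence are all illustrative assumptions, not the paper's model.

```python
import numpy as np

# Two-state (resting / moving) generator; rates per hour are assumed values.
G = np.array([[-0.2,  0.2],
              [ 0.5, -0.5]])

def trans(dt):
    """Gap-dependent transition matrix P(dt) = expm(G * dt), via eigen-decomposition."""
    lam, V = np.linalg.eig(G)
    return (V @ np.diag(np.exp(lam * dt)) @ np.linalg.inv(V)).real

def emis(step):
    """Step-length density per state: short steps when resting, long when moving."""
    lam = np.array([2.0, 0.25])
    return lam * np.exp(-lam * step)

# Step lengths with irregular gaps (hours): dense at night, a 6 h daytime gap.
steps = [0.1, 0.2, 3.0, 2.5, 0.3]
gaps = [0.5, 6.0, 0.5, 0.5]

alpha = np.array([0.5, 0.5]) * emis(steps[0])
c = alpha.sum(); loglik = np.log(c); alpha /= c
for s, dt in zip(steps[1:], gaps):
    alpha = (alpha @ trans(dt)) * emis(s)   # propagate over the actual gap dt
    c = alpha.sum()
    loglik += np.log(c)                     # scaled forward recursion
    alpha /= c
print(float(loglik))
```

Because the generator rows sum to zero, every P(dt) is a proper stochastic matrix, so the standard forward machinery carries over unchanged while honouring the true time spacing of the fixes.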

Keywords: hidden Markov Models, irregular observations, animal movement modelling, nocturnal predator

Procedia PDF Downloads 227
4697 Effectiveness of Control Measures for Ambient Fine Particulate Matters Concentration Improvement in Taiwan

Authors: Jiun-Horng Tsai, Shi-Jie Nieh

Abstract:

Fine particulate matter (PM₂.₅) has become an important issue all over the world over the last decade. Annual mean PM₂.₅ concentration has been over the ambient air quality standard of PM₂.₅ (annual average concentration as 15μg/m³) which adapted by Taiwan Environmental Protection Administration (TEPA). TEPA, therefore, has developed a number of air pollution control measures to improve the ambient concentration by reducing the emissions of primary fine particulate matter and the precursors of secondary PM₂.₅. This study investigated the potential improvement of ambient PM₂.₅ concentration by the TEPA program and the other scenario for further emission reduction on various sources. Four scenarios had been evaluated in this study, including a basic case and three reduction scenarios (A to C). The ambient PM₂.₅ concentration was evaluated by Community Multi-scale Air Quality modelling system (CMAQ) ver. 4.7.1 along with the Weather Research and Forecasting Model (WRF) ver. 3.4.1. The grid resolutions in the modelling work are 81 km × 81 km for domain 1 (covers East Asia), 27 km × 27 km for domain 2 (covers Southeast China and Taiwan), and 9 km × 9 km for domain 3 (covers Taiwan). The result of PM₂.₅ concentration simulation in different regions of Taiwan shows that the annual average concentration of basic case is 24.9 μg/m³, and are 22.6, 18.8, and 11.3 μg/m³, respectively, for scenarios A to C. The annual average concentration of PM₂.₅ would be reduced by 9-55 % for those control scenarios. The result of scenario C (the emissions of precursors reduce to allowance levels) could improve effectively the airborne PM₂.₅ concentration to attain the air quality standard. According to the results of unit precursor reduction contribution, the allowance emissions of PM₂.₅, SOₓ, and NOₓ are 16.8, 39, and 62 thousand tons per year, respectively. 
In the Kao-Ping air basin, the priority for reducing precursor emissions is PM₂.₅ > NOₓ > SOₓ, whereas in the other areas it is PM₂.₅ > SOₓ > NOₓ. This indicates that the target pollutants differ between air basins, so control measures need to be adapted to local conditions.
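The reported 9-55% improvement range follows directly from the quoted annual averages; a minimal sketch (concentrations taken from the abstract, rounding assumed):

```python
# Percent improvement of annual PM2.5 for each control scenario relative to
# the basic case, using the concentrations reported above (ug/m3).
base = 24.9
scenarios = {"A": 22.6, "B": 18.8, "C": 11.3}

reductions = {name: 100.0 * (base - conc) / base
              for name, conc in scenarios.items()}
for name, pct in reductions.items():
    print(f"Scenario {name}: {pct:.0f}% reduction")  # 9%, 24%, 55%
```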

Keywords: airborne PM₂.₅, community multi-scale air quality modelling system, control measures, weather research and forecasting model

Procedia PDF Downloads 116
4696 An Automatic Generating Unified Modelling Language Use Case Diagram and Test Cases Based on Classification Tree Method

Authors: Wassana Naiyapo, Atichat Sangtong

Abstract:

Software development processes under the Object-Oriented methodology involve many stages that are time-consuming and costly. An undetected error in the system analysis stage propagates into the design and implementation stages, and the unexpected output then forces a revision of the earlier work; every such rollback adds expense and delay. A good test process from the early phases therefore helps deliver software that is efficient, reliable, and meets user requirements. Unified Modelling Language (UML) is a notation that uses symbols to describe the work process in Object-Oriented Analysis (OOA). This paper presents an approach for automatically generating UML use case diagrams and test cases: the use case diagram is generated from the event table, and the test cases are generated from use case specifications and Graphical User Interfaces (GUIs). Test cases are derived with the Classification Tree Method (CTM), which classifies input data into nodes of a hierarchical structure. The paper also describes a program that generates the use case diagram and test cases. As a result, the approach reduces working time and increases efficiency.
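A minimal sketch of the classification-tree idea (assumed toy classifications for a hypothetical login GUI, not the authors' tool): each classification partitions one input into leaf classes, and a test case picks one leaf per classification.

```python
# Classification Tree Method, simplest combination rule: a test case is one
# leaf class chosen from each classification; the full test suite here is the
# cross product of all leaf classes.
from itertools import product

# Hypothetical classifications for a login GUI.
tree = {
    "username": ["valid", "empty", "unknown"],
    "password": ["correct", "wrong"],
}

test_cases = [dict(zip(tree, combo)) for combo in product(*tree.values())]
print(len(test_cases))  # 3 x 2 = 6 test cases
```

In practice the combination rule is usually constrained (e.g. pairwise coverage) rather than the full cross product, which grows multiplicatively with each classification.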

Keywords: classification tree method, test case, UML use case diagram, use case specification

Procedia PDF Downloads 136
4695 Domain Driven Design vs Soft Domain Driven Design Frameworks

Authors: Mohammed Salahat, Steve Wade

Abstract:

This paper presents and compares SSDDD, a Systematic Soft Domain Driven Design framework, with DDD, the Domain Driven Design framework, as a soft-systems approach to information systems development. SSDDD uses Soft Systems Methodology (SSM) as a guiding methodology to explore the problem situation and to develop an object-oriented domain model in UML for the given business domain, within which a sequence of design tasks leads to the implementation of a software system using the Naked Objects framework. The framework has been used in action research projects involving the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. The framework was proposed and evaluated in our previous work; this paper presents a comparison between SSDDD and DDD to show how SSDDD improves on DDD as an approach to modelling and implementing business domain perspectives for information systems development. The comparison process, its results, and the improvements are presented in the following sections of this paper.

Keywords: domain-driven design, soft domain-driven design, naked objects, soft language

Procedia PDF Downloads 272
4694 Online Battery Equivalent Circuit Model Estimation on Continuous-Time Domain Using Linear Integral Filter Method

Authors: Cheng Zhang, James Marco, Walid Allafi, Truong Q. Dinh, W. D. Widanage

Abstract:

Equivalent circuit models (ECMs) are widely used in battery management systems in electric vehicles and other battery energy storage systems. The battery dynamics and the model parameters vary under different working conditions, such as different temperatures and state of charge (SOC) levels, so online parameter identification can improve the modelling accuracy. This paper presents a method for online ECM parameter identification using a continuous-time (CT) estimation approach. The CT method has several advantages over discrete-time (DT) estimation methods for ECM parameter identification, owing to the widely separated battery dynamic modes and fast sampling, and it can also be used for online SOC estimation. Test data were collected from a lithium-ion cell, and the experimental results show that the presented CT method achieves better modelling accuracy than the conventional DT recursive least squares method. The effectiveness of the presented method for online SOC estimation is also verified on the test data.
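To illustrate the integral-filter idea behind CT identification, here is a simplified sketch on a synthetic first-order system (not the paper's ECM, filter design, or data): for y'(t) + a·y(t) = b·u(t), integrating both sides removes the derivative, so y(t) − y(0) = −a·∫y + b·∫u, which is linear in the unknown parameters (a, b).

```python
# Continuous-time identification by integral filtering: simulate a first-order
# system, form running integrals, and recover (a, b) by least squares.
import numpy as np

a_true, b_true = 2.0, 3.0
t = np.linspace(0.0, 5.0, 2001)
dt = t[1] - t[0]
u = np.sin(t)

# Simulate y' = -a*y + b*u with forward Euler.
y = np.zeros_like(t)
for k in range(len(t) - 1):
    y[k + 1] = y[k] + dt * (-a_true * y[k] + b_true * u[k])

# Running integrals (left-rectangle rule, consistent with the Euler step).
Y = np.concatenate(([0.0], np.cumsum(y[:-1]))) * dt
U = np.concatenate(([0.0], np.cumsum(u[:-1]))) * dt

# Least-squares solution of y - y0 = -a*Y + b*U.
A = np.column_stack([-Y, U])
a_hat, b_hat = np.linalg.lstsq(A, y - y[0], rcond=None)[0]
print(round(a_hat, 3), round(b_hat, 3))  # recovers (2.0, 3.0)
```

No numerical differentiation of noisy measurements is needed, which is one reason integral methods are attractive for fast-sampled battery data.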

Keywords: electric circuit model, continuous time domain estimation, linear integral filter method, parameter and SOC estimation, recursive least square

Procedia PDF Downloads 360
4693 Dynamic Mode Decomposition and Wake Flow Modelling of a Wind Turbine

Authors: Nor Mazlin Zahari, Lian Gan, Xuerui Mao

Abstract:

The power production in wind farms and the mechanical loads on the turbines are strongly affected by the wake of the wind turbine. There is thus a need to understand and model turbine wake dynamics for wind farm analysis and layout optimization: a good wake model is important for predicting plant performance and understanding fatigue loads. In this paper, Dynamic Mode Decomposition (DMD) was applied to data generated by a Direct Numerical Simulation (DNS) of the flow around a turbine, perturbed by upstream inflow noise. The technique is useful for analysing the wake flow, predicting its future states, and capturing the flow dynamics associated with the coherent structures behind the turbine. Since the DNS data come on unstructured, non-uniform meshes, the data were first interpolated onto an evenly spaced mesh before DMD was applied. The DMD analyses revealed characteristics of the travelling waves behind the turbine, e.g. the dominant helical flow structures and their corresponding frequencies. As a result, the dominant frequency is detected and its associated spatial structure identified; the dynamic mode representing the coherent structure is presented.

Keywords: coherent structure, Direct Numerical Simulation (DNS), dominant frequency, Dynamic Mode Decomposition (DMD)

Procedia PDF Downloads 316
4692 Review of Concepts and Tools Applied to Assess Risks Associated with Food Imports

Authors: A. Falenski, A. Kaesbohrer, M. Filter

Abstract:

Introduction: Risk assessments can be performed in various ways and at different degrees of complexity. To assess risks associated with imported foods, additional information must be taken into account compared with a risk assessment of regional products. The present review gives an overview of currently available best-practice approaches and data sources used for food import risk assessments (IRAs). Methods: A literature review was performed. PubMed was searched for articles about food IRAs published from 2004 to 2014 (English and German texts only, search string “(English [la] OR German [la]) (2004:2014 [dp]) import [ti] risk”). Titles and abstracts were screened for import risks in the context of IRAs. The selected publications were analysed according to a predefined questionnaire extracting the following information: risk assessment guidelines followed, modelling methods used, data and software applied, and the existence of an analysis of uncertainty and variability. IRAs cited in these publications were also included in the analysis. Results: The PubMed search returned 49 publications, 17 of which contained information about import risks and risk assessments. Within these, 19 cross-references of interest for the present study were identified, comprising original articles, reviews, and guidelines. The guidelines of the World Organisation for Animal Health (OIE) and of the Codex Alimentarius Commission were referenced in the IRAs for imports of animals and imports of foods, respectively; interestingly, a combination of both was also used to assess the risk associated with the import of live animals serving as a source of food. Methods ranged from fully quantitative IRAs using probabilistic and dose-response models to qualitative IRAs in which decision trees or severity tables were set up with parameter estimates based on expert opinion. Calculations were done using @Risk, R, or Excel. 
Most heterogeneous was the type of data used, ranging from general information on imported goods (food, live animals) to pathogen prevalence in the country of origin. These data were either publicly available in databases or lists (e.g., OIE WAHID and Handystatus II, FAOSTAT, Eurostat, TRACES), accessible at a national level (e.g., herd information), or open only to a small group of people (flight passenger import data at national airport customs offices). An uncertainty analysis was mentioned in some IRAs, but calculations were actually performed in only a few. Conclusion: The current state of the art in assessing the risks of imported foods is characterized by great heterogeneity in both general methodology and the data used. Information is often gathered case by case and reformatted by hand in order to perform the IRA. This analysis therefore illustrates the need for a flexible, modular framework connecting existing data sources with data analysis and modelling tools. Such an infrastructure could pave the way to IRA workflows applicable ad hoc, e.g. in a crisis situation.

Keywords: import risk assessment, review, tools, food import

Procedia PDF Downloads 287
4691 Computational Chemical-Composition of Carbohydrates in the Context of Healthcare Informatics

Authors: S. Chandrasekaran, S. Nandita, M. Shivathmika, Srikrishnan Shivakumar

Abstract:

The objective of this work is to analyse the computational chemical composition of carbohydrates in the context of healthcare informatics. The computation involves representing the complex molecular structure of a carbohydrate using graph theory and the deployable Chemical Markup Language (CML). Parallel molecular structures of chemical molecules, with or without adulterants added for the sake of business profit, can then be analysed in terms of robustness and derivatization measures. Rural healthcare programmes should create awareness of malnutrition, reduce the ill effects of decomposition, and help consumers know the level of such energy-storage mixtures in a quantitative way. Earlier works were based on empirical, wet-laboratory data, which can vary from time to time and whose mining results cannot be reused. The present work applies quantitative computational chemistry to carbohydrates to support a safe and secure right-to-food act and its regulations.

Keywords: carbohydrates, chemical-composition, chemical markup, robustness, food safety

Procedia PDF Downloads 359
4690 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems, in particular through increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to such an integrative facility data model, based on ontology modelling, is presented. The complete ontology development process is described, from input data acquisition through the definition of the ontology concepts to their population. First, a core facility ontology was developed, representing the generic facility infrastructure through the common facility concepts relevant from a facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was then extended and populated. For the development of full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as the test-bed platform. Finally, the way these ontology models supported the integration and interoperability of the overall airport energy management system is analysed.

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 420
4689 BIM Data and Digital Twin Framework: Preserving the Past and Predicting the Future

Authors: Mazharuddin Syed Ahmed

Abstract:

This research presents a framework developed for “Kahukura”, the Ara Polytechnic College of Architecture Studies building, which is Green Building certified. The framework integrates the development of a smart building digital twin by utilizing Building Information Modelling (BIM) and its maturity levels, including Levels of Development (LOD), the eight dimensions of BIM, Heritage BIM (H-BIM), and Facility Management BIM (FM BIM). The research also outlines a structured approach to building performance analysis and integration with the circular economy, encapsulated within a five-level digital twin framework. At Level 1, the Descriptive Twin provides a live, editable visual replica of the built asset, allowing specific data to be included and extracted. At Level 2, the Informative Twin integrates operational and sensory data, enhancing data verification and system integration. At Level 3, the Predictive Twin utilizes operational data to generate insights and proactive management suggestions. At Level 4, the Comprehensive Twin simulates future scenarios, enabling robust “what-if” analyses. Finally, at Level 5, the Autonomous Twin represents the pinnacle of digital twin evolution, capable of learning and acting autonomously on behalf of users.

Keywords: building information modelling, circular economy integration, digital twin, predictive analytics

Procedia PDF Downloads 21
4688 A Quantitative Evaluation of Text Feature Selection Methods

Authors: B. S. Harish, M. B. Revanasiddappa

Abstract:

Due to the rapid growth of text documents in digital form, automated text classification has become an important research area in the last two decades. The major challenges of text document representation are high dimensionality, sparsity, volume, and semantics. Since terms are the only features that can be found in documents, the selection of good terms (features) plays a very important role. In text classification, feature selection is a strategy for improving classification effectiveness, computational efficiency, and accuracy. In this paper, we present a quantitative analysis of the most widely used feature selection (FS) methods, viz. Term Frequency-Inverse Document Frequency (tf-idf), Mutual Information (MI), Information Gain (IG), Chi-Square (χ²), Term Frequency-Relevance Frequency (tf-rf), Term Strength (TS), Ambiguity Measure (AM), and Symbolic Feature Selection (SFS), for classifying text documents. All the feature selection methods were evaluated on standard datasets: 20 Newsgroups, the 4 University dataset, and Reuters-21578.
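As a toy illustration of one of the surveyed scores (made-up counts, not the paper's datasets), the chi-square statistic measures the dependence between a term's occurrence and a class label from a 2×2 contingency table:

```python
# Chi-square feature score for one (term, class) pair from document counts.
def chi_square(n11, n10, n01, n00):
    """n11: in-class docs with the term, n10: in-class docs without it,
    n01: out-of-class docs with the term, n00: out-of-class docs without it."""
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n01) * (n10 + n00) * (n11 + n10) * (n01 + n00)
    return num / den

# Term present in 40/50 in-class docs but only 10/50 out-of-class docs:
print(chi_square(40, 10, 10, 40))  # 36.0 -> strongly class-indicative
```

Terms are ranked by this score per class and the top-k are kept, shrinking the high-dimensional, sparse representation the abstract describes.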

Keywords: classifiers, feature selection, text classification

Procedia PDF Downloads 430
4687 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

Floods have huge environmental and economic impacts, so flood prediction receives a great deal of attention. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we use the Box-Jenkins approach, a four-stage method comprising model identification, parameter estimation, diagnostic checking, and forecasting (prediction). The main tools used for the ARIMA modelling were SAS and SPSS. Model identification was done by visual inspection of the ACF and PACF. SAS computed the model parameters using the ML, CLS, and ULS methods. The diagnostic checks, namely the AIC criterion and the RACF and RPACF graphs, were used to verify the selected model. The best ARIMA model for the annual maximum discharge (AMD) time series was ARIMA(4,1,1), with an AIC value of 88.87; the RACF and RPACF showed the residuals to be independent. The model was then used to forecast AMD for 10 future years, demonstrating its ability to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R²).
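A minimal sketch of the parameter-estimation step on a synthetic AR(1) series using conditional least squares in NumPy (the paper fits ARIMA(4,1,1) in SAS/SPSS; this only illustrates the idea of regressing the series on its own past):

```python
# Conditional least squares for an AR(1) model x_t = phi * x_{t-1} + e_t.
import numpy as np

rng = np.random.default_rng(1)
phi = 0.7                      # true autoregressive coefficient
x = np.zeros(5000)
for t in range(1, len(x)):
    x[t] = phi * x[t - 1] + rng.normal()

# Regress x_t on x_{t-1}; for AR(p) this becomes a p-column regression.
phi_hat = float(np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1]))
print(round(phi_hat, 2))       # close to 0.7
```

The differencing step of ARIMA(p,1,q) would simply apply this kind of fit to the first differences of the series, and the AIC would then be compared across candidate (p, q) orders.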

Keywords: time series modelling, stochastic processes, ARIMA model, Karkheh river

Procedia PDF Downloads 271
4686 Rural Households’ Resilience to Food Insecurity in Niger

Authors: Aboubakr Gambo, Adama Diaw, Tobias Wunscher

Abstract:

This study attempts to identify the factors affecting rural households’ resilience to food insecurity in Niger. We first create a resilience index by applying Principal Component Analysis to the following five household-level variables: income, food expenditure, duration of grain held in stock, livestock in Tropical Livestock Units, and number of farms exploited; we then apply Structural Equation Modelling to identify the determinants. Data from the 2010 National Survey on Households’ Vulnerability to Food Insecurity conducted by the National Institute of Statistics are used. The study shows that the asset and social safety net indicators are significant and have a positive impact on household resilience, whereas climate change, approximated by long-term mean rainfall, has a negative and significant effect on household resilience to food insecurity. The results indicate that strengthening household resilience to food insecurity requires increasing assistance to households through social safety nets and helping them gather more resources in order to acquire more assets. Furthermore, early warning of climatic events could alert households, especially farmers, to prepare for and avoid the heavy losses they experience whenever an adverse climatic event occurs.
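A sketch of the composite-index step: the first principal component of the standardized household indicators serves as the resilience score (random toy data here, not the 2010 survey):

```python
# Resilience index as the first principal component of standardized indicators.
import numpy as np

rng = np.random.default_rng(2)
# Columns: income, food expenditure, grain stock duration, livestock TLU, farms.
X = rng.normal(size=(200, 5))
X[:, 1] += 0.8 * X[:, 0]                  # correlate indicators, as in practice

Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
pc1 = eigvecs[:, -1]                      # direction of largest variance
index = Z @ pc1                           # one resilience score per household
print(index.shape)                        # (200,)
```

The resulting score would then enter the structural equation model as the latent outcome whose determinants (assets, safety nets, rainfall) are estimated.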

Keywords: food insecurity, principal component analysis, structural equation modelling, resilience

Procedia PDF Downloads 341
4685 Challenges and Opportunities in Modelling Energy Behavior of Household in Malaysia

Authors: Zuhaina Zakaria, Noraliza Hamzah, Siti Halijjah Shariff, Noor Aizah Abdul Karim

Abstract:

The residential sector in Malaysia has become the single largest energy sector, accounting for 21% of the country’s entire energy usage. Over the past 10 years, the government has undertaken a number of energy efficiency initiatives in the residential sector, yet there is no clear evidence that these strategies have substantially reduced total residential energy consumption. Household electrical appliances such as air conditioners, refrigerators, lighting, and televisions are used according to the consumers’ activities: the behavior of household occupants plays an important role in energy consumption and influences the operation of the physical devices. A successful energy efficiency program therefore requires not only the technological aspect but also the consumer behavior component. This paper focuses on the challenges and opportunities in modelling residential consumer behavior in Malaysia. A field survey of residential consumers was carried out, and the responses were analyzed to determine the consumers’ level of knowledge and awareness of energy efficiency. The analyses will be used to determine a suitable framework for explaining household energy use intentions and behavior. These findings will benefit the power utility company and the energy regulator in addressing energy-efficiency-related issues.

Keywords: consumer behavior theories, energy efficiency, household occupants, residential consumer

Procedia PDF Downloads 301
4684 Modelling and Management of Vegetal Pest Based On Case of Xylella Fastidiosa in Alicante

Authors: Maria Teresa Signes Pont, Jose Juan Cortes Plana

Abstract:

Our proposal provides a suitable model of the spread of plant pests, and particularly of the propagation of Xylella fastidiosa in almond trees. We compared the impact of temperature and humidity on the propagation of Xylella fastidiosa in various subspecies, comparing the Balearic Islands with Alicante (Spain). Most sharpshooter and spittlebug species showed peaks in population density during the months of higher mean temperature and relative humidity (April-October), except for the spittlebug Clastoptera sp. 1, whose adult population peaked in September-October (late summer and early autumn). The critical season for the spread runs from hatching until the pre-reproductive season (January-April). We focused on the insect overwintering in the egg state; the eggs normally hatch in early March. The nymphs secrete a foam (mucilage) in which they live and which protects them from natural enemies and temperature changes and prevents desiccation as long as the relative humidity stays above 75%. The interaction between the life cycles of the vectors and the vegetation influences the vectors’ food preferences and is responsible for the general seasonal shift of the population from vegetation to trees and vice versa. In addition to the temperature maps, we therefore mapped humidity, since it affects the spread of the pest Xylella fastidiosa (Xf).
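A deliberately coarse illustration of the kind of temperature/humidity rule such maps encode; the 75% humidity figure comes from the abstract, while the temperature floor is a hypothetical cut-off added here for illustration only:

```python
# Rule-based suitability flag for spittlebug nymph development.
def nymph_foam_protected(temp_c, rel_humidity_pct):
    """The nymph mucilage foam is reported to prevent desiccation while
    relative humidity stays above 75%; the 10 C floor is an assumed value."""
    return rel_humidity_pct > 75.0 and temp_c >= 10.0

print(nymph_foam_protected(18.0, 80.0), nymph_foam_protected(18.0, 60.0))
```

Applied cell by cell to gridded temperature and humidity data, such a rule yields the seasonal suitability maps discussed above.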

Keywords: Xylella fastidiosa, almond tree, temperature, humidity, environmental model

Procedia PDF Downloads 146
4683 A Nonlinear Visco-Hyper Elastic Constitutive Model for Modelling Behavior of Polyurea at Large Deformations

Authors: Shank Kulkarni, Alireza Tabarraei

Abstract:

The remarkable properties of polyurea, such as flexibility, durability, and chemical resistance, have brought it a wide range of applications in various industries. Effective prediction of the response of polyurea under different loading and environmental conditions necessitates the development of an accurate constitutive model. As with most polymers, the behavior of polyurea depends on both strain and strain rate, and the constitutive model should capture both effects. To this end, a nonlinear hyper-viscoelastic constitutive model is developed in this paper by the superposition of a hyperelastic and a viscoelastic model; the proposed model can capture the behavior of polyurea under compressive loading at various strain rates. A four-parameter Ogden model and a Mooney-Rivlin model are used to model the hyperelastic behavior of polyurea, while the viscoelastic behavior is modeled using both a three-parameter standard linear solid (SLS) model and a K-BKZ model. Comparison of the modeling results with experiments shows that the Ogden and SLS models predict the behavior of polyurea more accurately. The material parameters are found by curve-fitting the proposed model to uniaxial compression test data. The proposed model closely reproduces the stress-strain behavior of polyurea for strain rates up to 6500/s.
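A sketch of the hyperelastic branch only: the uniaxial Cauchy stress of an incompressible N-term Ogden model (illustrative parameters, not the fitted ones; the paper's full model adds the SLS viscoelastic branch for rate dependence):

```python
# Uniaxial Cauchy stress for an incompressible Ogden hyperelastic model.
def ogden_uniaxial_stress(stretch, mus, alphas):
    """sigma = sum_p mu_p * (stretch**alpha_p - stretch**(-alpha_p / 2))."""
    return sum(mu * (stretch ** a - stretch ** (-a / 2.0))
               for mu, a in zip(mus, alphas))

# Two-term example with made-up parameters (MPa) at a stretch of 1.2.
sigma = ogden_uniaxial_stress(1.2, mus=[1.5, 0.3], alphas=[2.0, -2.0])
print(round(sigma, 3))  # 0.758 MPa
```

Curve fitting then amounts to choosing (mu_p, alpha_p) to minimize the misfit between this function and the measured compression stress-strain curve at each strain rate.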

Keywords: constitutive modelling, ogden model, polyurea, SLS model, uniaxial compression test

Procedia PDF Downloads 216