Search results for: reduced order aeroelastic model (ROAM)
30277 Estimation of the Parameters of Muskingum Methods for the Prediction of the Flood Depth in the Moudjar River Catchment
Authors: Fares Laouacheria, Said Kechida, Moncef Chabi
Abstract:
The objective of the study was the hydrological routing modelling needed for continuous monitoring of the hydrological situation in the Moudjar river catchment, especially during floods, using the Hydrologic Engineering Center Hydrologic Modeling System (HEC-HMS). HEC-GeoHMS was used to transfer data from a geographic information system (GIS) to HEC-HMS for delineating and modelling the river catchment in order to estimate the runoff volume, which is used as input to the hydrological routing model. Two hydrological routing models, namely the Muskingum and Muskingum-Cunge routing models, were used for conducting this study. A comparison between the parameters of the Muskingum and Muskingum-Cunge routing models in HEC-HMS was used for modelling flood routing in the Moudjar river catchment and for determining the relationship between these parameters and the physical characteristics of the river. The results indicate that the effects of the input parameters, the weighting factor X and the travel time K, on the output results are significant, with the Muskingum routing model being more sensitive to its input parameters than the Muskingum-Cunge routing model. This study can contribute to understanding and improving knowledge of the mechanisms of river floods, especially in ungauged river catchments.
Keywords: HEC-HMS, hydrological modelling, Muskingum routing model, Muskingum-Cunge routing model
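For readers comparing the two schemes, the sketch below illustrates the standard Muskingum recursion, whose coefficients are functions of the travel time K and the weighting factor X discussed in the abstract; the parameter values and the inflow hydrograph are illustrative assumptions, not data from the Moudjar catchment.

# Illustrative Muskingum channel routing (hypothetical parameters and inflow).
def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
    """Route an inflow hydrograph; K and dt in hours, X dimensionless (0-0.5)."""
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom
    outflow = [inflow[0]]                      # assume an initial steady state
    for i in range(1, len(inflow)):
        q = c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[-1]
        outflow.append(max(q, 0.0))
    return outflow

if __name__ == "__main__":
    hydrograph = [10, 35, 90, 140, 110, 70, 40, 20, 12, 10]   # m3/s, hypothetical
    print(muskingum_route(hydrograph))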
Procedia PDF Downloads 278
30276 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore
Authors: Ronal Muresano, Andrea Pagano
Abstract:
Nowadays, mathematical/statistical applications are developed with greater complexity and accuracy. However, this precision and complexity mean that applications need more computational power in order to be executed faster. In this sense, multicore environments play an important role in improving and optimizing the execution time of these applications. These environments allow the inclusion of more parallelism inside the node. However, taking advantage of this parallelism is not an easy task, because we have to deal with problems such as core communications, data locality, memory sizes (cache and RAM), synchronizations, data dependencies in the model, etc. These issues become more important when we wish to improve the application's performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool developed by the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore. All these improvements are done in an automatic and transparent manner with the aim of improving the performance metrics of our tool. Finally, experimental evaluations show the effectiveness of our new optimized version, in which we have achieved a considerable improvement in execution time: the time has been reduced by around 96% for the best case tested, between the original serial version and the automatic parallel version.
Keywords: algorithm optimization, bank failures, OpenMP, parallel techniques, statistical tool
Procedia PDF Downloads 370
30275 Derivation of Bathymetry from High-Resolution Satellite Images: Comparison of Empirical Methods through Geographical Error Analysis
Authors: Anusha P. Wijesundara, Dulap I. Rathnayake, Nihal D. Perera
Abstract:
Bathymetric information is of fundamental importance to coastal and marine planning and management, nautical navigation, and scientific studies of marine environments. Satellite-derived bathymetry data provide detailed information in areas where conventional sounding data are lacking and conventional surveys are inaccessible. Two empirical approaches, the log-linear bathymetric inversion model and the non-linear bathymetric inversion model, are applied for deriving bathymetry from high-resolution multispectral satellite imagery. This study compares these two approaches by means of geographical error analysis for the Kankesanturai site using WorldView-2 satellite imagery. The Levenberg-Marquardt method was used to calibrate the parameters of the non-linear inversion model, and multiple linear regression was applied to calibrate the log-linear inversion model. In order to calibrate both models, Single Beam Echo Sounding (SBES) data in the study area were used as reference points. Residuals were calculated as the difference between the derived depth values and the validation echo sounder bathymetry data, and the geographical distribution of the model residuals was mapped. The spatial autocorrelation was calculated, the performance of the bathymetric models was compared, and the geographic errors for both models were presented. A spatial error model was constructed from the initial bathymetry estimates and the estimates of autocorrelation. This spatial error model is used to generate more reliable estimates of bathymetry by quantifying the autocorrelation of model error and incorporating this into an improved regression model. The log-linear model (R²=0.846) performs better than the non-linear model (R²=0.692). Finally, the spatial error models improved the bathymetric estimates derived from the linear and non-linear models up to R²=0.854 and R²=0.704, respectively. The Root Mean Square Error (RMSE) was calculated for all reference points in various depth ranges. The magnitude of the prediction error increases with depth for both the log-linear and the non-linear inversion models. The overall RMSE for the log-linear and non-linear inversion models was ±1.532 m and ±2.089 m, respectively.
Keywords: log-linear model, multi spectral, residuals, spatial error model
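For context, a widely used form of the log-linear (band-ratio) inversion model referred to above is the Stumpf-type relation, reproduced here only as a sketch; the coefficients m_1, m_0 and the constant n are calibration parameters and are not the values obtained in this study:

Z = m_1 \frac{\ln\!\big(n\,R_w(\lambda_i)\big)}{\ln\!\big(n\,R_w(\lambda_j)\big)} - m_0,

where R_w(\lambda_i) and R_w(\lambda_j) are the water reflectances in two spectral bands. The non-linear model is instead fitted directly to the band reflectances by least squares, which is where the Levenberg-Marquardt calibration mentioned above enters.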
Procedia PDF Downloads 298
30274 Analysis of Consumer Preferences for Housing in Saudi Arabia
Authors: Mohammad Abdulaziz Algrnas, Emma Mulliner
Abstract:
Housing projects have been established in Saudi Arabia, by both government and private construction companies, to meet the increasing demand from Saudi inhabitants across the country. However, the real estate market supply does not meet consumer preference requirements. Preferences normally differ depending on the consumer's situation, such as the household's sociological characteristics (age, household size and composition), resources (income, wealth, information and experience), tastes and priorities. Collecting information about consumer attitudes, preferences and perceptions is important for the real estate market in order to better understand housing demand and to ensure that this is met by appropriate supply. The aim of this paper is to identify consumer preferences for housing in Saudi Arabia. A quantitative closed-ended questionnaire was conducted with housing consumers in Saudi Arabia in order to gain insight into consumer needs, current household situation, preferences for a number of investigated housing attributes and consumers' perceptions around the current housing problem. 752 survey responses were obtained and analysed in order to describe preferences for housing attributes and make comparisons between groups. Factor analysis was also conducted to identify and reduce the attributes. The results indicate a difference in preference according to the gender of the respondents and depending on their region of residence.
Keywords: housing attributes, Saudi Arabia, consumer preferences, housing preferences
Procedia PDF Downloads 543
30273 VideoAssist: A Labelling Assistant to Increase Efficiency in Annotating Video-Based Fire Dataset Using a Foundation Model
Authors: Keyur Joshi, Philip Dietrich, Tjark Windisch, Markus König
Abstract:
In the field of surveillance-based fire detection, the volume of incoming data is increasing rapidly. However, the labeling of a large industrial dataset is costly due to the high annotation costs associated with current state-of-the-art methods, which often require bounding boxes or segmentation masks for model training. This paper introduces VideoAssist, a video annotation solution that utilizes a video-based foundation model to annotate entire videos with minimal effort, requiring the labeling of bounding boxes for only a few keyframes. To the best of our knowledge, VideoAssist is the first method to significantly reduce the effort required for labeling fire detection videos. The approach offers bounding box and segmentation annotations for the video dataset with minimal manual effort. Results demonstrate that the performance of labels annotated by VideoAssist is comparable to those annotated by humans, indicating the potential applicability of this approach in fire detection scenarios.
Keywords: fire detection, label annotation, foundation models, object detection, segmentation
Procedia PDF Downloads 17
30272 Modeling the Time-Dependent Rheological Behavior of Clays Used in Fabrication of Ceramic
Authors: Larbi Hammadi, N. Boudjenane, N. Benhallou, R. Houjedje, R. Reffis, M. Belhadri
Abstract:
Many clays exhibit thixotropic behavior, in which the apparent viscosity of the material decreases with time of shearing at a constant shear rate. The structural kinetic model (SKM) was used to characterize the thixotropic behavior of two different kinds of clays used in the fabrication of ceramic. The clays selected for analysis represent fluid and semi-solid clay materials. The SKM postulates that the change in rheological behavior is associated with shear-induced breakdown of the internal structure of the clays. The model describes the decay of the structure with time at constant shear rate, assuming nth-order kinetics for the decay of the material structure with a rate constant.
Keywords: ceramic, clays, structural kinetic model, thixotropy, viscosity
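As a sketch of the structural kinetic model mentioned above (standard SKM notation, not values fitted in this work), a dimensionless structural parameter \lambda decays from 1 toward an equilibrium value \lambda_e at constant shear rate, and the apparent viscosity follows the structure:

\frac{d\lambda}{dt} = -k\,(\lambda - \lambda_e)^{n}, \qquad \lambda(t) = \frac{\eta(t) - \eta_e}{\eta_0 - \eta_e},

where \eta_0 and \eta_e are the initial and equilibrium apparent viscosities, k is the breakdown rate constant, and n is the kinetic order; integrating the first equation gives the time-dependent viscosity curve fitted to the shearing data.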
Procedia PDF Downloads 410
30271 Seismic Performance of Reinforced Concrete Frame Structure Based on Plastic Rotation
Authors: Kahil Amar, Meziani Faroudja, Khelil Nacim
Abstract:
The principal objective of this study is the evaluation of the seismic performance of reinforced concrete frame structures, taking into account behavior laws reflecting the real behavior of the materials, using the CASTEM2000 software. The finite element model used is based on the modified Takeda model with Timoshenko elements for columns and beams. This model is validated against the Vecchio experimental reinforced concrete (RC) frame model. A study then focused on the behavior of a three-level, three-story RC frame in order to visualize the positioning of the plastic hinges (plastic rotations), determined from the curvature distribution along the elements. The results obtained show that the beams of the 1st and 2nd levels developed very large plastic rotations, exceeding the value corresponding to CP (Collapse Prevention, θCP = 0.02 rad), whereas those developed at the 3rd level are between IO and LS (Immediate Occupancy and Life Safety, θIO = 0.005 rad and θLS = 0.01 rad, respectively); thus, the beams of the first and second levels sustain very significant damage.
Keywords: seismic performance, performance level, pushover analysis, plastic rotation, plastic hinge
Procedia PDF Downloads 130
30270 Multiscale Modelling of Citrus Black Spot Transmission Dynamics along the Pre-Harvest Supply Chain
Authors: Muleya Nqobile, Winston Garira
Abstract:
We present a compartmental deterministic multi-scale model which encompasses the internal plant defence mechanism and its interaction with the pathogen, and we then consider nesting this model into the epidemiological model. The objective was to improve our understanding of the within-host and between-host transmission dynamics of Guignardia citricarpa Kiely. The inflow into the infected class was scaled down to the individual level, while the outflow was scaled up to the average population level. A conceptual model and a mathematical model were constructed to provide a theoretical framework that can be used to predict or identify disease patterns.
Keywords: epidemiological model, mathematical modelling, multi-scale modelling, immunological model
Procedia PDF Downloads 460
30269 Numerical Investigation of Geotextile Application in Clay Reinforcement in ABAQUS Software
Authors: Seyed Abolhasan Naeini, Eisa Aliagahei
Abstract:
Today, the use of geosynthetic materials in geotechnical activities is increasing significantly. One of the main uses of these materials is to increase the compressive strength of clay reinforced by geotextile layers. In the present study, the effect of clay reinforcement by geotextile layers on the compressive strength of clay has been investigated using modeling in ABAQUS 6.11.3 software. For this purpose, the modified Drucker-Prager cap model has been chosen to simulate the stress-strain behavior of the soil layers, and a linear elastic model is used for the geotextile layer. Unreinforced samples and samples reinforced with one, two, and three geotextile layers are modeled in the software. In order to validate the results, an article in the same field was used and the numerical modeling results were calibrated against the laboratory results. Based on the obtained results, the software is well suited for this modeling, and the results of the numerical model agree with the laboratory results to a very acceptable extent; as the number of geotextile layers increases, the error between the laboratory sample and the software model increases. The largest error, 7.3%, is for the sample reinforced with three layers of geotextile.
Keywords: Abaqus, cap model, clay, geotextile layer, reinforced soil
Procedia PDF Downloads 88
30268 Seismic Fragility for Sliding Failure of Weir Structure Considering the Process of Concrete Aging
Authors: HoYoung Son, Ki Young Kim, Woo Young Jung
Abstract:
This study investigated, by means of seismic fragility analysis, the change in weir structure performance when the durability of concrete, the main material of the weir structure, decreases due to aging. In the analysis, it was assumed that the elastic modulus of concrete was reduced by 10% in order to account for aged deterioration. The seismic fragility analysis was based on the Monte Carlo simulation method combined with a 2D nonlinear finite element model on the ABAQUS platform, with the deterioration of concrete taken into account. Finally, the seismic fragility of the model pre- and post-deterioration was compared to study the performance of the weir. The results show that the probability of failure at the moderate damage level for the deteriorated model was larger than for the pre-deterioration model when the peak ground acceleration (PGA) exceeded 0.4 g.
Keywords: weir, FEM, concrete, fragility, aging
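The abstract does not detail the fragility fitting step; the sketch below shows one common way of turning Monte Carlo failure fractions at several PGA levels into a lognormal fragility curve. The PGA levels, failure fractions, and fitted parameters are hypothetical, not the weir results.

import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

def fragility(pga, theta, beta):
    """Lognormal fragility curve: P(reaching the damage state | PGA)."""
    return norm.cdf(np.log(pga / theta) / beta)

# Hypothetical Monte Carlo results: fraction of failed samples per PGA level (g).
pga_levels = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
p_fail     = np.array([0.02, 0.10, 0.28, 0.55, 0.78, 0.90])

(theta, beta), _ = curve_fit(fragility, pga_levels, p_fail, p0=(0.4, 0.5))
print(f"median capacity = {theta:.2f} g, dispersion = {beta:.2f}")
print("P(failure | PGA = 0.4 g) =", round(fragility(0.4, theta, beta), 3))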
Procedia PDF Downloads 425
30267 A Novel Model for Saturation Velocity Region of Graphene Nanoribbon Transistor
Authors: Mohsen Khaledian, Razali Ismail, Mehdi Saeidmanesh, Mahdiar Hosseinghadiry
Abstract:
A semi-analytical model for the impact ionization coefficient of graphene nanoribbons (GNRs) is presented. The model is derived by calculating the probability of electrons reaching the ionization threshold energy Et and the distance traveled by an electron while gaining Et. In addition, the ionization threshold energy is semi-analytically modeled for GNRs. We justify our assumptions using analytic modeling and comparison with simulation results. A Gaussian simulator together with analytical modeling is used to calculate the ionization threshold energy, and Kinetic Monte Carlo is employed to calculate the ionization coefficient and verify the analytical results. Finally, the ionization profile is presented using the proposed models and simulation, and the results are compared with those of silicon.
Keywords: nanostructures, electronic transport, semiconductor modeling, systems engineering
Procedia PDF Downloads 474
30266 The Impact of Supply Chain Strategy and Integration on Supply Chain Performance: Supply Chain Vulnerability as a Moderator
Authors: Yi-Chun Kuo, Jo-Chieh Lin
Abstract:
The objective of a supply chain strategy is to reduce waste and increase efficiency to attain cost benefits, and to guarantee supply chain flexibility when facing the ever-changing market environment in order to meet customer requirements. Strategy implementation aims to fulfill common goals and attain benefits by integrating upstream and downstream enterprises, sharing information, conducting common planning, and taking part in decision making, so as to enhance the overall performance of the supply chain. With the rise of outsourcing and globalization, the increasing dependence on suppliers and customers, and the rapid development of information technology, the complexity and uncertainty of the supply chain have intensified, and supply chain vulnerability has surged, resulting in adverse effects on supply chain performance. Thus, this study uses supply chain vulnerability as a moderating variable and applies structural equation modeling (SEM) to determine the relationships among supply chain strategy, supply chain integration, and supply chain performance, as well as the moderating effect of supply chain vulnerability on supply chain performance. Data were collected through questionnaires distributed to the management level of enterprises in Taiwan and China; 149 questionnaires were received. The result of the confirmatory factor analysis shows that the path coefficients of supply chain strategy on supply chain integration and supply chain performance are positive (0.497, t = 4.914; 0.748, t = 5.919), indicating a significantly positive effect. Supply chain integration is also significantly positively correlated with supply chain performance (0.192, t = 2.273). The moderating effects of supply chain vulnerability on the relationships of supply chain strategy and supply chain integration with supply chain performance are significant (7.407; 4.687). In Taiwan, 97.73% of enterprises are small- and medium-sized enterprises (SMEs) focusing on receiving original equipment manufacturer (OEM) and original design manufacturer (ODM) orders. In order to meet the needs of customers and to respond to market changes, these enterprises especially focus on supply chain flexibility and on their integration with upstream and downstream enterprises. According to the observations of this research, the effect of supply chain vulnerability on supply chain performance is significant, so enterprises need to attach great importance to the management of supply chain risk and conduct risk analysis on their suppliers in order to formulate response strategies for emergency situations. At the same time, risk management should be incorporated into the supply chain so as to reduce the effect of supply chain vulnerability on overall supply chain performance.
Keywords: supply chain integration, supply chain performance, supply chain vulnerability, structural equation modeling
Procedia PDF Downloads 318
30265 The Future of Insurance: P2P Innovation versus Traditional Business Model
Authors: Ivan Sosa Gomez
Abstract:
Digitalization has impacted the entire insurance value chain, and the growing movement towards P2P platforms and the collaborative economy is also beginning to have a significant impact. P2P insurance is defined as an innovation enabling policyholders to pool their capital, self-organize, and self-manage their own insurance. In this context, new InsurTech start-ups are emerging as peer-to-peer (P2P) providers, based on a model that differs from traditional insurance. As a result, although P2P platforms do not change the fundamental basis of insurance, they do enable potentially more efficient business models to be established in terms of ensuring the coverage of risk. It is therefore relevant to determine whether P2P innovation can have substantial effects on the future of the insurance sector. For this purpose, it is considered necessary to develop P2P innovation from a business perspective, as well as to build a comparison between a traditional model and a P2P model from an actuarial perspective. Objectives: The objectives are (1) to represent P2P innovation in the business model compared to the traditional insurance model and (2) to establish a comparison between a traditional model and a P2P model from an actuarial perspective. Methodology: The research design is defined as action research, in the sense of understanding and solving the problems of a collectivity linked to an environment, applying theory and best practices according to the approach. For this purpose, the study is carried out through the participatory variant, which involves the collaboration of the participants, given that in this design participants are considered experts. Prolonged immersion in the field is carried out as the main instrument for data collection. Finally, an actuarial model is developed for the calculation of premiums that allows projections of future scenarios to be established and conclusions to be drawn between the two models. Main Contributions: From an actuarial and business perspective, we aim to contribute by developing a comparison of the two models in the coverage of risk in order to determine whether P2P innovation can have substantial effects on the future of the insurance sector.
Keywords: Insurtech, innovation, business model, P2P, insurance
Procedia PDF Downloads 93
30264 Prediction for the Pressure Drop of Gas-Liquid Cylindrical Cyclone in Sub-Sea Production System
Authors: Xu Rumin, Chen Jianyi, Yue Ti, Wang Yaan
Abstract:
With the rapid development of subsea oil and gas exploitation, the demand for the related underwater process equipment is increasing fast. In order to reduce energy consumption, there is a tendency to separate the gas and oil phases directly on the seabed, and an advanced separator is therefore needed. In this paper, the pressure drop of a new type of separator used in the subsea system, the Gas Liquid Cylindrical Cyclone (GLCC), is investigated by both experiments and numerical simulation. In the experiments, single-phase flow and gas-liquid two-phase flow in the GLCC were tested. For the simulation, the performance of the GLCC under both laboratory and industrial conditions was calculated. The Eulerian model was implemented to describe the mixture flow field in the GLCC under experimental conditions and industrial oil-natural gas conditions. Furthermore, a relationship among the Euler number (Eu), Reynolds number (Re), and Froude number (Fr) is derived from similarity analysis and simulation data, which describes the pressure-drop performance of the GLCC. These results can serve as a reference for the design and application of GLCCs in the deep sea.
Keywords: dimensionless analysis, gas-liquid cylindrical cyclone, numerical simulation, pressure drop
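As a sketch of the similarity analysis mentioned above, the three dimensionless groups are typically defined as below, and a correlation of the indicated form is then fitted to the simulation data; the exponents a, b, and c are placeholders, not the values obtained in the paper:

Eu = \frac{\Delta p}{\rho u^{2}}, \qquad Re = \frac{\rho u D}{\mu}, \qquad Fr = \frac{u}{\sqrt{g D}}, \qquad Eu = a\,Re^{\,b}\,Fr^{\,c},

where \Delta p is the pressure drop, u the inlet mixture velocity, D the cyclone body diameter, and \rho and \mu the mixture density and viscosity.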
Procedia PDF Downloads 171
30263 Analysis the Trajectory of the Spacecraft during the Transition to the Planet's Orbit Using Aerobraking in the Atmosphere of the Planet
Authors: Zaw Min Tun
Abstract:
The paper focuses on the spacecraft's trajectory transition from an interplanetary hyperbolic orbit to the planet's orbit using aerobraking in the atmosphere of the planet. A considerable mass of fuel is consumed during the spacecraft's transition from the planet's gravity-assist trajectory into the planet's satellite orbit. To reduce the fuel consumption in this transition, the spacecraft's velocity must be reduced in the planet's atmosphere and its orbital transition time shortened. The paper is devoted to the use of the planet's atmosphere for slowing down the spacecraft during its transition into the satellite orbit under uncertain atmospheric parameters. The orbital transition time of the spacecraft is reduced by controlling the angle-of-attack values along the aerodynamic deceleration path and by adjusting the minimum flight altitude of the spacecraft at the pericenter in the planet's upper atmosphere.
Keywords: aerobraking, atmosphere of the planet, orbital transition time, spacecraft's trajectory
Procedia PDF Downloads 305
30262 Fertigation Use in Agriculture and Biosorption of Residual Nitrogen by Soil Microorganisms
Authors: Irina Mikajlo, Jakub Elbl, Helena Dvořáčková, Antonín Kintl, Jindřich Kynický, Martin Brtnický, Jaroslav Záhora
Abstract:
The present work deals with the possible use of fertigation in agriculture and its impact on the availability of mineral nitrogen (Nmin) in the topsoil and subsoil horizons. The aim of the present study is to demonstrate the effect of the presence of organic matter in fertigation on the microbial transformation and availability of mineral nitrogen forms. The main motivation is the potential use of pre-treated wastewater as a source of organic carbon (Corg) and residual nutrients (Nmin) for fertigation. A laboratory experiment was conducted to demonstrate the effect of the arable-land fertilization method on Nmin availability at different soil depths, using model experimental containers filled with soil from the topsoil and subsoil horizons taken from the study area. Tufted hairgrass (Deschampsia caespitosa) was chosen as the model plant. The research area is the water source protection zone Brezova nad Svitavou, where significant underground reservoirs of drinking water of the highest quality are located. Since the second half of the last century, local sources of drinking water have shown an increase in nitrogenous compounds, which originate almost exclusively from arable land. Therefore, the attention of the following text focuses on the fate of mineral nitrogen in the plant-soil complex. The research results show that fertigation with Corg in combination with mineral fertilizer can reduce the amount of Nmin leached from the topsoil horizon of agricultural soils. In addition, some reduction in plant biomass production may occur.
Keywords: fertigation, fertilizers, mineral nitrogen, soil microorganisms
Procedia PDF Downloads 352
30261 Sustainable Urban Landscape Practices: A New Concept to Reduce Ecological Degradation
Authors: Manjari Rai
Abstract:
Urbanization is an inevitable process in the development of human society and an outcome of economic development and scientific and technological progress. While the urbanization process promotes the development of human civilization, it also undoubtedly has a corresponding impact on the urban landscape. The urban environment has suffered unprecedented damage, mainly due to the increase in urban population density and heavy migration, traffic congestion, and environmental pollution. All of this has led to major ecological degradation and imbalance. As land is consumed by rapid and unplanned urbanization, green land is diminished and severe pollution is created by waste products. Plastic, the most alarming waste at landfill sites, is as yet uncontrolled. Therefore, initiatives must be taken to reduce plastic-mediated pollution and increase green application. However, increasing green land is not possible where the land is already occupied by urban structures. In order to create a harmonious environment, sustainable development of the urban landscape becomes a matter of prime focus. This paper thus discusses the concept of ecological design combined with urban landscape design, green landscape design on urban structures, and sustainable development through the use of recyclable waste materials, which is also a low-cost approach to urban landscape design.
Keywords: ecological, degradation sustainable, landscape, urban
Procedia PDF Downloads 425
30260 A Location Routing Model for the Logistic System in the Mining Collection Centers of the Northern Region of Boyacá-Colombia
Authors: Erika Ruíz, Luis Amaya, Diego Carreño
Abstract:
The main objective of this study is to design a mathematical model for the logistics of the mining collection centers in the northern region of the department of Boyacá (Colombia), determining the structure that facilitates the flow of products along the supply chain. In order to achieve this, it is necessary to define a suitable design of the distribution network, taking into account the products, the customers' characteristics, and the availability of information. Likewise, some other aspects must be defined, such as the number and capacity of collection centers to establish and the routes that must be taken to deliver products to the customers, among others. This research uses one of the operations research problems applied in the design of distribution networks, known as the Location Routing Problem (LRP).
Keywords: location routing problem, logistic, mining collection, model
Procedia PDF Downloads 218
30259 Companies’ Internationalization: Multi-Criteria-Based Prioritization Using Fuzzy Logic
Authors: Jorge Anibal Restrepo Morales, Sonia Martín Gómez
Abstract:
A model based on a logical framework was developed to quantify SMEs' internationalization capacity. To do so, linguistic variables such as human talent, infrastructure, innovation strategies, FTAs, marketing strategies, finance, etc. were integrated. It is argued that a company's management of international markets depends on internal factors, especially the capabilities and resources available. This study considers internal factors the biggest business challenge because they force companies to develop an adequate set of capabilities. At this stage, importance and strategic relevance have to be defined in order to build competitive advantages. A fuzzy inference system is proposed to model the resources, skills, and capabilities that determine the success of internationalization. Data: 157 linguistic variables were used. These variables were defined by international trade entrepreneurs, experts, consultants, and researchers. Using expert judgment, the variables were condensed into 18 factors that explain SMEs' export capacity. The proposed model is applied by means of a case study of the textile and clothing cluster in Medellin, Colombia. In the model implementation, a general index of 28.2 was obtained for internationalization capabilities. The result confirms that the sector's current capabilities and resources are not sufficient for a successful integration into the international market. The model specifies the factors and variables which need to be worked on in order to improve export capability. In the case of the textile companies, the lack of continuous recording of information stands out. Likewise, there are very few studies directed towards developing long-term plans, and there is little consistency in export criteria. This method emerges as an innovative management tool linked to internal organizational spheres and their different abilities.
Keywords: business strategy, exports, internationalization, fuzzy set methods
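The 157 linguistic variables and 18 factors of the actual model are not reproduced here; the sketch below only illustrates the style of Mamdani-type fuzzy inference the abstract describes, with two hypothetical inputs (human talent and innovation strategy, scored 0-100) and one output (an export capacity index on a 0-100 scale).

import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function evaluated on array x."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

universe = np.linspace(0, 100, 1001)          # output universe: capacity index
low_out  = trimf(universe, 0, 0, 50)
high_out = trimf(universe, 50, 100, 100)

def infer(talent, innovation):
    """Return a crisp export capacity index from two 0-100 input scores."""
    t_low  = trimf(np.array([talent]), 0, 0, 60)[0]
    t_high = trimf(np.array([talent]), 40, 100, 100)[0]
    i_low  = trimf(np.array([innovation]), 0, 0, 60)[0]
    i_high = trimf(np.array([innovation]), 40, 100, 100)[0]
    w_high = min(t_high, i_high)              # IF talent high AND innovation high THEN capacity high
    w_low  = max(t_low, i_low)                # IF talent low OR innovation low THEN capacity low
    aggregated = np.maximum(np.minimum(w_high, high_out),
                            np.minimum(w_low, low_out))
    if aggregated.sum() == 0:
        return 50.0
    return float((universe * aggregated).sum() / aggregated.sum())   # centroid defuzzification

print(round(infer(talent=35, innovation=40), 1))   # weak inputs -> low index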
Procedia PDF Downloads 296
30258 Test of Capital Account Monetary Model of Floating Exchange Rate Determination: Further Evidence from Selected African Countries
Authors: Oloyede John Adebayo
Abstract:
This paper tested a variant of the monetary model of exchange rate determination, Frankel's Capital Account Monetary Model (CAAM) based on the real interest rate differential, on the floating exchange rate experiences of three developing African countries: Ghana, Nigeria, and The Gambia. The study adopted the Autoregressive Instrumental Variable (AIV) package and the Almon polynomial lag procedure of regression analysis, based on the assumption that the coefficients follow a third-order polynomial with a zero end-point constraint. The results found some support for the CAAM hypothesis that the exchange rate responds proportionately to changes in the money supply, inversely to income, and positively to interest rate and expected inflation differentials. On this basis, the study points the attention of monetary authorities and researchers to the relevance and usefulness of the CAAM as an appropriate tool and useful benchmark for analyzing the exchange rate behaviour of most developing countries.
Keywords: exchange rate, monetary model, interest differentials, capital account
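Purely as a sketch of the estimable relation implied by the hypotheses above (notation and signs are illustrative, not the authors' reported specification), the reduced-form equation typically regressed is

s_t = \beta_0 + \beta_1 (m_t - m_t^{*}) - \beta_2 (y_t - y_t^{*}) + \beta_3 (i_t - i_t^{*}) + \beta_4 (\pi_t^{e} - \pi_t^{e*}) + \varepsilon_t,

where s_t is the log exchange rate and the bracketed terms are the money supply, income, interest rate, and expected inflation differentials between the home and foreign country; the CAAM hypothesis described above corresponds to \beta_1 \approx 1 and \beta_2, \beta_3, \beta_4 > 0.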
Procedia PDF Downloads 415
30257 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows
Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid
Abstract:
Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing the topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics may be very demanding in both time and cost. Meanwhile, computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged so that the model parameters can be evaluated from measured data. However, this approach is not always possible, and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, the adaptive-control ensemble Kalman filter is implemented to combine the optimality of the observation data in order to obtain an accurate estimate of the topography. The main features of this method are, on the one hand, the ability to handle different complex geometries with no need for any rearrangement of the original model to rewrite it in an explicit form and, on the other hand, its strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples, and locations of observations. The obtained results demonstrate the high reliability and accuracy of the proposed techniques.
Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil
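The abstract does not state the exact form of the minimization function; a typical least-squares choice for this inverse problem, given free-surface observations \eta^{obs}_k at points (x_k, t_k) and a candidate bed profile z_b, is

J(z_b) = \frac{1}{2} \sum_{k=1}^{N} \big( \eta(x_k, t_k; z_b) - \eta^{\mathrm{obs}}_{k} \big)^{2},

where \eta(x, t; z_b) is the free surface predicted by the shallow water solver for that bed. This quantity is minimized iteratively, with the ensemble Kalman filter stage supplying the update of z_b from the mismatch between predicted and observed free surfaces.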
Procedia PDF Downloads 130
30256 Domain-Specific Deep Neural Network Model for Classification of Abnormalities on Chest Radiographs
Authors: Nkechinyere Joy Olawuyi, Babajide Samuel Afolabi, Bola Ibitoye
Abstract:
This study collected a dataset of chest radiographs, preprocessed it, and formulated a deep neural network model for detecting abnormalities. It also evaluated the performance of the formulated model and implemented a prototype of it. This was done with a view to developing a deep neural network model that automatically classifies abnormalities in chest radiographs. In order to achieve the overall purpose of this research, a large set of chest X-ray images was sourced from the CheXpert dataset, an online repository of annotated chest radiographs compiled by the Machine Learning Research Group, Stanford University. The chest radiographs were preprocessed into a format that can be fed into a deep neural network; the preprocessing techniques used were standardization and normalization. The classification problem was formulated as a multi-label binary classification model, which used a convolutional neural network architecture to decide whether or not an abnormality was present in the chest radiographs. The classification model was evaluated using specificity, sensitivity, and the Area Under the Curve (AUC) score as parameters. A prototype of the classification model was implemented using the Keras open-source deep learning framework in the Python programming language. Based on the AUC-ROC curve, the model was able to classify Atelectasis, Support devices, Pleural effusion, Pneumonia, normal CXR (no finding), Pneumothorax, and Consolidation. However, Lung opacity and Cardiomegaly had a probability of less than 0.5 and were thus classified as absent. Precision, recall, and F1 score values were 0.78; this implies that the numbers of false positives and false negatives are the same, revealing some measure of label imbalance in the dataset. The study concluded that the developed model is sufficient to classify the abnormalities present in chest radiographs as present or absent.
Keywords: transfer learning, convolutional neural network, radiograph, classification, multi-label
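The exact architecture is not given in the abstract; the sketch below shows a minimal Keras multi-label setup of the kind described (independent sigmoid outputs trained with binary cross-entropy, one unit per finding). The 224x224 grayscale input size and the nine target labels are assumptions for illustration, not the authors' actual configuration.

import tensorflow as tf
from tensorflow.keras import layers, models

NUM_LABELS = 9          # e.g. atelectasis, effusion, pneumonia, ... (assumed count)

def build_model(input_shape=(224, 224, 1)):
    """Minimal multi-label CNN: each output unit is an independent binary finding."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_LABELS, activation="sigmoid"),   # multi-label output
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",             # one binary task per label
                  metrics=[tf.keras.metrics.AUC(multi_label=True)])
    return model

model = build_model()
model.summary()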
Procedia PDF Downloads 130
30255 Piezoelectric based Passive Vibration Control of Composite Turbine Blade using Shunt Circuit
Authors: Kouider Bendine, Zouaoui Satla, Boukhoulda Farouk Benallel, Shun-Qi Zhang
Abstract:
Turbine blades are subjected to a variety of loads that lead to undesirable vibration. Such vibration can cause serious damage or even lead to total failure of the blade. The present paper addresses the vibration control of a turbine blade. The study proposes a passive vibration control using piezoelectric material; the passive control is achieved by shunting an RL circuit to the piezoelectric patch in a parallel configuration. To this end, a finite element model of the blade with the piezoelectric patch is implemented in ANSYS APDL. The model is then subjected to a harmonic frequency-based analysis for the cases with the control on and off. The results show that the proposed methodology was able to reduce the blade vibration by 18%.
Keywords: blade, active piezoelectric vibration control, finite element, shunt circuit
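As a sketch of the usual first-pass tuning of such a resonant shunt (the paper does not report its tuning rule), the electrical resonance formed by the shunt inductance L and the inherent capacitance C_p of the piezoelectric patch is matched to the targeted structural mode \omega_n:

\omega_e = \frac{1}{\sqrt{L\,C_p}} \approx \omega_n \quad\Rightarrow\quad L \approx \frac{1}{\omega_n^{2}\,C_p},

after which the resistance R is chosen to maximize the damping of that mode; its optimal value depends on the electromechanical coupling coefficient of the patch-structure assembly.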
Procedia PDF Downloads 104
30254 Application of Ultrasonic Assisted Machining Technique for Glass-Ceramic Milling
Authors: S. Y. Lin, C. H. Kuan, C. H. She, W. T. Wang
Abstract:
In this study, the ultrasonic assisted machining (UAM) technique is applied in a side-surface milling experiment on a glass-ceramic workpiece material. A tungsten carbide cutting tool with diamond coating is used in conjunction with two kinds of cooling/lubrication media, water-soluble (WS) cutting fluid and minimum quantity lubrication (MQL). Full factorial process parameter combinations for the milling experiments are planned to investigate the effect of the process parameters on cutting performance. From the experimental results, the aim is to find a better process parameter combination for which the edge indentation and the surface roughness are acceptable. In the machining experiments, an ultrasonic oscillator was used to excite the cutting tool along the radial direction, producing a very small-amplitude vibration at a frequency of 20 kHz to assist the machining process. After processing, a toolmaker's microscope was used to inspect the side-surface morphology, edge indentation, and cutting tool wear under different combinations of cutting parameters, and the experimental results were analyzed and discussed. The results show that the main parameters leading to edge indentation of the glass-ceramic are the cutting depth and feed rate; in order to reduce edge indentation, a lower cutting depth and feed rate need to be used. Water-soluble cutting fluid provides a better cooling effect in the primary cutting area; it may effectively reduce the edge indentation and improve the surface morphology of the glass-ceramic. The use of the ultrasonic assisted technique can effectively enhance surface finish cleanness and reduce cutting tool wear and edge indentation.
Keywords: glass-ceramic, ultrasonic assisted machining, cutting performance, edge-indentation
Procedia PDF Downloads 285
30253 A Study for Area-level Mosquito Abundance Prediction by Using Supervised Machine Learning Point-level Predictor
Authors: Theoktisti Makridou, Konstantinos Tsaprailis, George Arvanitakis, Charalampos Kontoes
Abstract:
In the literature, data-driven approaches to mosquito abundance prediction rely on supervised machine learning models that are trained with historical in-situ measurements. The drawback of this approach is that once the model is trained on point-level (specific x, y coordinates) measurements, the predictions of the model again refer to the point level. These point-level predictions reduce the applicability of such solutions, since many early-warning and mitigation applications need predictions at an area level, such as a municipality or village. In this study, we apply a data-driven predictive model which relies on open satellite Earth Observation and geospatial data and is trained with historical point-level in-situ measurements of mosquito abundance. We then propose a methodology to extend a point-level predictive model to a broader area-level prediction. Our methodology relies on random spatial sampling of the area of interest (similar to a Poisson hard-core process), obtaining the EO and geomorphological information for each sample, making the point-wise prediction for each sample, and aggregating the predictions to represent the average mosquito abundance of the area. We quantify the performance of the transformation from point-level to area-level predictions and analyze it in order to understand which parameters have a positive or negative impact on it. The goal of this study is to propose a methodology that predicts the mosquito abundance of a given area by relying on point-level prediction and to provide qualitative insights regarding the expected performance of the area-level prediction. We applied our methodology to historical data (of Culex pipiens) for two areas of interest (the Veneto region of Italy and Central Macedonia in Greece). In both cases, the results were consistent: the mean mosquito abundance of a given area can be estimated with accuracy similar to that of the point-level predictor, sometimes even better. The density of the samples used to represent an area has a positive effect on performance, whereas the absolute number of sampling points, without reference to the size of the area, is not informative regarding performance. Additionally, we saw that the distance between the sampling points and the real in-situ measurements used for training did not strongly affect the performance.
Keywords: mosquito abundance, supervised machine learning, culex pipiens, spatial sampling, west nile virus, earth observation data
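The sketch below illustrates the aggregation step described above: random points with a minimum separation (a hard-core-style rejection) are drawn inside the area of interest, a point-level predictor is evaluated at each point, and the mean is reported as the area-level abundance. The bounding box, the minimum distance, and the predict_point function are placeholders, not the study's actual predictor or regions.

import math
import random

def predict_point(lon, lat):
    """Placeholder for the trained point-level abundance model."""
    return 10 + 5 * math.sin(lon) + 3 * math.cos(lat)        # hypothetical surface

def sample_hardcore(bbox, n_target, min_dist, max_tries=10000, seed=0):
    """Draw up to n_target points in bbox, rejecting points closer than min_dist."""
    lon_min, lat_min, lon_max, lat_max = bbox
    rng = random.Random(seed)
    points = []
    tries = 0
    while len(points) < n_target and tries < max_tries:
        tries += 1
        p = (rng.uniform(lon_min, lon_max), rng.uniform(lat_min, lat_max))
        if all(math.dist(p, q) >= min_dist for q in points):
            points.append(p)
    return points

def area_level_prediction(bbox, n_samples=200, min_dist=0.01):
    """Aggregate point-level predictions into an area-level mean abundance."""
    pts = sample_hardcore(bbox, n_samples, min_dist)
    return sum(predict_point(lon, lat) for lon, lat in pts) / len(pts)

# Hypothetical area of interest given as (lon_min, lat_min, lon_max, lat_max).
print(area_level_prediction((22.0, 40.2, 23.5, 41.2)))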
Procedia PDF Downloads 149
30252 An Improved Total Variation Regularization Method for Denoising Magnetocardiography
Authors: Yanping Liao, Congcong He, Ruigang Zhao
Abstract:
The application of magnetocardiography signals to the detection of cardiac electrical function is a new technology developed in recent years. The magnetocardiography signal is detected with Superconducting Quantum Interference Devices (SQUIDs) and has considerable advantages over electrocardiography (ECG). Extracting the magnetocardiography (MCG) signal, which is buried in noise, is difficult and is a critical issue to be resolved in cardiac monitoring systems and MCG applications. In order to remove the severe background noise, the Total Variation (TV) regularization method is proposed to denoise the MCG signal. The approach transforms the denoising problem into a minimization problem, and the majorization-minimization algorithm is applied to solve it iteratively. However, the traditional TV regularization method tends to cause a step (staircase) effect and lacks constraint adaptability. In this paper, an improved TV regularization method for denoising the MCG signal is proposed to improve the denoising precision. The improvement of this method is divided into three parts. First, high-order TV is applied to reduce the step effect, and the corresponding second-order derivative matrix is used in place of the first-order one. Then, the positions of the non-zero elements in the second-order derivative matrix are determined based on the peak positions detected by the detection window. Finally, adaptive constraint parameters are defined to eliminate noise and preserve the signal peak characteristics. Theoretical analysis and experimental results show that this algorithm can effectively improve the output signal-to-noise ratio and has superior performance.
Keywords: constraint parameters, derivative matrix, magnetocardiography, regular term, total variation
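The underlying optimization can be sketched as follows (illustrative notation; the authors' adaptive, position-dependent constraint parameters are not reproduced here). With y the noisy MCG record, x the denoised signal, and D_2 a second-order difference matrix, the high-order TV denoising problem is

\min_{x} \; \tfrac{1}{2}\,\lVert y - x \rVert_2^{2} + \lambda\,\lVert D_2 x \rVert_1,

which the majorization-minimization algorithm solves by replacing the non-smooth \ell_1 term with a quadratic majorizer at each iteration; making \lambda position-dependent (small near detected peaks, large elsewhere) gives the adaptive constraint behaviour described above.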
Procedia PDF Downloads 153
30251 Software Quality Measurement System for Telecommunication Industry in Malaysia
Authors: Nor Fazlina Iryani Abdul Hamid, Mohamad Khatim Hasan
Abstract:
The evolution of software quality measurement began when McCall introduced his quality model in 1977. Since then, several software quality models and software quality measurement methods have emerged, but none of them focused on the telecommunication industry. In this paper, a software quality measurement system for the telecommunication industry is implemented, which is necessary to accommodate the rapid growth of that industry. The quality value of telecommunication-related software can be calculated with this system by entering the required parameters. The system calculates the quality value of the measured software based on predefined quality metrics, aggregates it by referring to the quality model, and classifies the quality level of the software based on the Net Satisfaction Index (NSI). Thus, a software quality measurement system is important to both developers and users in order to produce high-quality software products for the telecommunication industry.
Keywords: software quality, quality measurement, quality model, quality metric, net satisfaction index
Procedia PDF Downloads 593
30250 Modelling and Optimization of a Combined Sorption Enhanced Biomass Gasification with Hydrothermal Carbonization, Hot Gas Cleaning and Dielectric Barrier Discharge Plasma Reactor to Produce Pure H₂ and Methanol Synthesis
Authors: Vera Marcantonio, Marcello De Falco, Mauro Capocelli, Álvaro Amado-Fierro, Teresa A. Centeno, Enrico Bocci
Abstract:
Concerns about energy security, energy prices, and climate change have led scientific research towards sustainable alternatives to fossil fuels: renewable energy sources coupled with hydrogen as an energy vector and with carbon capture and conversion technologies. Among the technologies investigated in recent decades, biomass gasification has attracted great interest owing to the possibility of obtaining low-cost, CO₂-negative hydrogen production from a large variety of widely available organic wastes. Upstream and downstream treatments were then studied in order to maximize the hydrogen yield, reduce the content of organic and inorganic contaminants below the admissible levels of the coupled technologies, and capture and convert carbon dioxide. However, studies that analyse a whole process comprising all of those technologies are still missing. In order to fill this gap, the present paper investigates the combination of hydrothermal carbonization (HTC), sorption-enhanced gasification (SEG), hot gas cleaning (HGC), and CO₂ conversion by a dielectric barrier discharge (DBD) plasma reactor for H₂ production from biomass waste by means of Aspen Plus software. The proposed model aims to identify and optimise the performance of the plant by varying operating parameters (such as temperature, CaO/biomass ratio, separation efficiency, etc.). The carbon footprint of the overall plant is 2.3 kg CO₂/kg H₂, lower than the latest limit value imposed by the European Commission for hydrogen to be considered "clean", which was set at 3 kg CO₂/kg H₂. The hydrogen yield referred to the whole plant is 250 g H₂/kg biomass.
Keywords: biomass gasification, hydrogen, aspen plus, sorption enhanced gasification
Procedia PDF Downloads 81
30249 Proposal for a Generic Context Meta-Model
Authors: Jaouadi Imen, Ben Djemaa Raoudha, Ben Abdallah Hanene
Abstract:
Access to relevant information that is adapted to users' needs, preferences, and environment is a challenge in many running applications. This has led to the emergence of context-aware systems. To facilitate the development of this class of applications, it is necessary that they share a common context meta-model. In this article, we present our context meta-model, which is defined using the OMG Meta Object Facility (MOF). This meta-model is based on the analysis and synthesis of the context concepts proposed in the literature.
Keywords: context, meta-model, MOF, awareness system
Procedia PDF Downloads 562
30248 Using Computational Fluid Dynamics to Model and Design a Preventative Application for Strong Wind
Authors: Ming-Hwi Yao, Su-Szu Yang
Abstract:
Typhoons are one of the major types of disasters that affect Taiwan each year and that cause severe damage to agriculture. Indeed, the damage exacted during a typical typhoon season can be up to $1 billion, and is responsible for nearly 75% of yearly agricultural losses. However, there is no consensus on how to reduce the damage caused by the strong winds and heavy precipitation engendered by typhoons. One suggestion is the use of windbreak nets, which are a low-cost and easy-to-use disaster mitigation strategy for crop production. In the present study, we conducted an evaluation to determine the optimal conditions of a windbreak net by using a computational fluid dynamics (CFD) model. This model may be used as a reference for crop protection. The results showed that CFD simulation validated windbreak nets of different mesh sizes and heights in the experimental area; thus, CFD is an efficient tool for evaluating the effectiveness of windbreak nets. Specifically, the effective wind protection length and height were found to be 6 and 1.3 times the length and height of the windbreak net, respectively. During a real typhoon, maximum wind gusts of 18 m s⁻¹ can be reduced to 4 m s⁻¹ by using a windbreak net that has a 70% blocking rate. In short, windbreak nets are significantly effective in protecting typhoon-affected areas.
Keywords: computational fluid dynamics, disaster, typhoon, windbreak net
Procedia PDF Downloads 192