Search results for: lumped parameters model
21477 Technical and Economic Analysis Effects of Various Parameters on the Performance of Heat Recovery System on Gas Complex Turbo Generators
Authors: Hefzollah Mohammadian, Mohammad Bagher Heidari
Abstract:
This paper examines the technical and economic effects of various parameters on the performance of a heat recovery system installed on gas complex turbo generators. Given the importance of this issue, the main goals being economic efficiency and cost reduction, the project follows similar plans in which the target is the implementation of specific patterns. The project also supports the gas refinery process: after the system is added to the turbine, the actual efficiency of the process is analyzed, potential problems are predicted and fixed, and appropriate measures are taken according to the results of the simulation analysis. The modeling and the study of the effect of the different parameters have been carried out using Thermo Flow.
Keywords: turbo compressor, turbo generator, heat recovery boiler, gas turbines
Procedia PDF Downloads 304
21476 A Regression Model for Residual-State Creep Failure
Authors: Deepak Raj Bhat, Ryuichi Yatabe
Abstract:
In this study, a residual-state creep failure model was developed based on the residual-state creep test results of clayey soils. To develop the proposed model, the regression analyses were carried out using R. The model results for the failure time (tf) and critical displacement (δc) were compared with the experimental results and found to be in close agreement. It is expected that the proposed regression model for residual-state creep failure will be useful for the prediction of displacement of different clayey soils in the future.
Keywords: regression model, residual-state creep failure, displacement prediction, clayey soils
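The regression fitting described in this abstract was done in R on the authors' test data; a minimal ordinary least-squares sketch of the same idea, with an assumed linear form and entirely hypothetical creep data, looks like this:

```python
# Minimal ordinary least-squares sketch for a residual-state creep
# failure regression. The linear form tf = a + b*x and the data below
# are hypothetical; the paper fits its own model to clayey-soil tests.

def ols_fit(xs, ys):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Hypothetical creep stress ratios and observed failure times (hours).
stress_ratio = [1.05, 1.10, 1.20, 1.30, 1.40]
failure_time = [95.0, 80.0, 55.0, 32.0, 10.0]

a, b = ols_fit(stress_ratio, failure_time)
predicted_tf = a + b * 1.25  # interpolated failure time at ratio 1.25
```

With a fitted line in hand, a displacement or failure-time prediction for a new stress level is a single evaluation of the regression equation.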
Procedia PDF Downloads 408
21475 Towards A New Maturity Model for Information System
Authors: Ossama Matrane
Abstract:
Information Systems have become a strategic lever for enterprises, contributing effectively to the alignment of business processes with enterprise strategy and to gains in productivity and effectiveness. Many organizations are therefore currently involved in implementing sustainable Information Systems, and a large number of studies have been conducted over the last decade to define the success factors of information systems. In particular, many studies on maturity models have been carried out, some of which address the maturity of Information Systems. In this article, we report on the development of a maturity model specifically designed for information systems. The model is built from three components: the Maturity Model for Information Security Management, the OPM3 Project Management Maturity Model, and the COBIT processes for IT governance. Our proposed model defines three maturity stages that help an organization build a strong Information System in support of its objectives. It provides a very practical structure with which to assess and improve Information System implementation.
Keywords: information system, maturity models, information security management, OPM3, IT governance
Procedia PDF Downloads 447
21474 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses
Authors: Ouzayr Rabhi, Ibtissam Arrassen
Abstract:
To provide a complete analysis of the organization and to support decision-making, leaders need relevant data; Data Warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method to derive a multidimensional schema from heterogeneous databases. In this article, we present a model-driven approach to the design of data warehouses. We describe a multidimensional meta-model and specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first meta-model is mapped into the second through transformation rules expressed in the Query/View/Transformation (QVT) language. The proposal is validated by applying our approach to generate the multidimensional schema of a Balanced Scorecard (BSC) DW. We are interested in the BSC perspectives, which are highly linked to the vision and strategies of an organization.
Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML
Procedia PDF Downloads 160
21473 Instant Location Detection of Objects Moving at High Speed in C-OTDR Monitoring Systems
Authors: Andrey V. Timofeev
Abstract:
A practical and efficient approach is suggested for estimating the instant bounds of high-speed objects in C-OTDR monitoring systems. In the case of super-dynamic objects (trains, cars), it is difficult to obtain an adequate estimate of the instantaneous object localization because of estimation lag. In other words, reliable estimation of the coordinates of a monitored object requires some time for the C-OTDR system to collect observations, and only when the required sample volume has been collected can the final decision be issued. This is contrary to the requirements of many real applications; for example, in rail traffic management systems we need localization data for dynamic objects in real time. The way to solve this problem is to use a set of statistically independent parameters of the C-OTDR signals to obtain the most reliable solution in real time. Parameters of this type can be called 'signaling parameters' (SPs). Several SPs carry information about the instant localization of dynamic objects in each C-OTDR channel. The problem is that some of these parameters are very sensitive to the dynamics of seismoacoustic emission sources but are unstable, while, as a rule, a very stable SP becomes insensitive. This report describes a method for co-processing SPs designed to obtain the most effective estimates of dynamic object localization within the C-OTDR monitoring system framework.
Keywords: C-OTDR system, co-processing of signaling parameters, high-speed objects localization, multichannel monitoring systems
Procedia PDF Downloads 470
21472 Evaluation of Geomechanical and Geometrical Parameters’ Effects on Hydro-Mechanical Estimation of Water Inflow into Underground Excavations
Authors: M. Mazraehli, F. Mehrabani, S. Zare
Abstract:
In general, mechanical and hydraulic processes are not independent of each other in jointed rock masses; therefore, the hydro-mechanical coupling of geomaterials should be a center of attention in rock mechanics. Rocks naturally contain discontinuities whose presence strongly influences the mechanical and hydraulic characteristics of the medium. Because of this effect, experimental investigations on intact rock cannot identify jointed rock mass behavior, and numerical methods are used for this purpose. In this paper, water inflow into a tunnel under a significant water table has been estimated using the hydro-mechanical discrete element method (HM-DEM). In addition, the effects of geomechanical and geometrical parameters, including the constitutive model, friction angle, joint spacing, dip of joint sets, and stress factor, on the estimated inflow rate have been studied. The results demonstrate that inflow rates are not identical for different constitutive models, and that the inflow rate decreases with increased spacing and stress factor.
Keywords: distinct element method, fluid flow, hydro-mechanical coupling, jointed rock mass, underground excavations
Procedia PDF Downloads 166
21471 Location Choice of Firms in an Unequal Length Streets Model: Game Theory Approach as an Extension of the Spoke Model
Authors: Kiumars Shahbazi, Salah Salimian, Abdolrahim Hashemi Dizaj
Abstract:
Location choice is one of the key elements in the success and survival of industrial centers and has a great impact on reducing the cost of establishing and launching various economic activities. In this study, a streets-with-unequal-length model is used; it is the classic extension of the Spoke model, but with an unlimited number of streets of uneven lengths. The results show that the Spoke model is a special case of the streets-with-unequal-length model. According to the results of this study, if the strategy of the firms is to select both price and location, there is no equilibrium in the game. Furthermore, increased street length leads to increased firm profit, and as the number of streets increases, the firms choose locations far from the center (maximum differentiation) and their output decreases. Moreover, the firms' production rate tends toward zero as the number of streets goes to infinity, and the perfect competition outcome is achieved.
Keywords: locating, Nash equilibrium, streets with unequal length model
Procedia PDF Downloads 203
21470 Taguchi Method for Analyzing a Flexible Integrated Logistics Network
Authors: E. Behmanesh, J. Pannek
Abstract:
Logistics network design is known as one of the strategic decision problems. As such problems belong to the category of NP-hard problems, traditional methods fail to find an optimal solution in a short time. In this study, we attempt to involve reverse flow through an integrated design of a forward/reverse supply chain network formulated as a mixed integer linear program. This integrated, multi-stage model is enriched by three different delivery paths, which makes the problem more complex. To tackle such an NP-hard problem, a memetic algorithm based on a revised random-path direct encoding method is considered as the solution methodology. Every algorithm has parameters that need to be investigated to reveal its best performance. In this regard, the Taguchi method is adopted to identify the optimum operating conditions of the proposed memetic algorithm and improve the results. In this study, four factors are considered: population size, crossover rate, local search iterations, and number of iterations. Analyzing the parameters and the improvement in results are the outlook of this research.
Keywords: integrated logistics network, flexible path, memetic algorithm, Taguchi method
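The core computation in a Taguchi parameter study is a signal-to-noise ratio per factor level; a minimal sketch with the "larger-the-better" S/N form and hypothetical objective values for one factor (population size) might look like this:

```python
import math

# Taguchi "larger-the-better" signal-to-noise ratio:
#   S/N = -10 * log10( mean(1 / y_i^2) )
# used here to rank candidate levels of a single tuning factor by the
# objective values a metaheuristic achieved. The factor levels and the
# replicated run results below are hypothetical illustration values.

def sn_larger_is_better(ys):
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

# Hypothetical objective values for three population-size levels,
# three replicated runs each.
levels = {50: [91.0, 88.0, 90.0],
          100: [95.0, 96.0, 94.0],
          150: [93.0, 92.0, 95.0]}

sn = {lvl: sn_larger_is_better(ys) for lvl, ys in levels.items()}
best_level = max(sn, key=sn.get)  # the level with the highest S/N wins
```

A full Taguchi study would vary all four factors together over an orthogonal array and compute one S/N value per factor level from the array's runs, rather than testing one factor in isolation as above.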
Procedia PDF Downloads 187
21469 Vibro-Acoustic Modulation for Crack Detection in Windmill Blades
Authors: Abdullah Alnutayfat, Alexander Sutin
Abstract:
One of the most important types of renewable energy resources is wind energy, which can be produced by wind turbines. The blades of a wind turbine are exposed to the pressure of a harsh environment, which causes a significant issue for the wind power industry in terms of maintenance cost and blade failure. One reliable method for blade inspection is vibroacoustic structural health monitoring (SHM), which examines information obtained from the structural vibrations of the blade. However, all vibroacoustic SHM techniques are based on comparing the structural vibration of intact and damaged structures, which places a practical limit on their use. Methods for nonlinear vibroacoustic SHM are more sensitive to damage and cracking and do not need to be compared to data from the intact structure. This paper presents the Vibro-Acoustic Modulation (VAM) method, based on the modulation of a high-frequency probe wave by the low-frequency load (pump wave) produced by blade rotation. The blade rotation alternates bending stress due to gravity, leading to crack size variations and variations in the blade resonance frequency. This method can be used with the classical SHM vibration method, in which the blade is excited by piezoceramic actuator patches bonded to the blade and the vibration response is received by another piezoceramic sensor. The VAM modification of this method analyzes the spectra of the detected signal and their sideband components. We model the VAM system as a simple mechanical oscillator whose parameters (resonance frequency and damping) vary due to low-frequency blade rotation. This model uses the blade vibration parameters and the crack's influence on the blade resonance properties from previous research papers to predict the modulation index (MI).
Keywords: wind turbine blades, damage detection, vibro-acoustic structural health monitoring, vibro-acoustic modulation
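The sideband analysis described above can be sketched numerically: a low-frequency pump amplitude-modulates a high-frequency probe, creating sidebands at fc ± fp, and the modulation index follows from the sideband-to-carrier amplitude ratio. All frequencies and the modulation depth below are hypothetical, and pure amplitude modulation is assumed rather than the paper's oscillator model:

```python
import math

# VAM sketch: a pump at fp modulates the amplitude of a probe at fc.
# For pure AM, the spectrum has a carrier at fc and two sidebands at
# fc +/- fp with amplitude m/2 each, so MI = (A_left + A_right) / A_carrier.

fs = 8000.0           # sample rate, Hz (hypothetical)
fc, fp = 400.0, 10.0  # probe (carrier) and pump frequencies, Hz
m_true = 0.2          # assumed modulation depth
n = 8000              # one-second record; every tone sits on an exact DFT bin

signal = [(1.0 + m_true * math.cos(2 * math.pi * fp * t / fs))
          * math.cos(2 * math.pi * fc * t / fs) for t in range(n)]

def amplitude(sig, f):
    """Amplitude of the component at frequency f via direct correlation."""
    re = sum(s * math.cos(2 * math.pi * f * t / fs) for t, s in enumerate(sig))
    im = sum(s * math.sin(2 * math.pi * f * t / fs) for t, s in enumerate(sig))
    return 2.0 * math.hypot(re, im) / len(sig)

a_carrier = amplitude(signal, fc)
a_side = amplitude(signal, fc - fp) + amplitude(signal, fc + fp)
mi_estimate = a_side / a_carrier  # recovers m_true for pure AM
```

In a real measurement the crack adds the modulation, so a rising MI over baseline is the damage indicator; the clean recovery here only holds because the tones fall on exact frequency bins of the record.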
Procedia PDF Downloads 8521468 Static Analysis Deployment Model for Code Quality on Research and Development Projects of Software Development
Authors: Jeong-Hyun Park, Young-Sik Park, Hyo-Teag Jung
Abstract:
This paper presents a static analysis deployment model for code quality in R&D projects of software development. The proposed model includes the scope of R&D projects and an index for static analysis of source code, an operation model and execution process, and the environments and infrastructure system for R&D projects. The static analysis results of a pilot project are presented as a case study based on the proposed deployment model and environment, together with strategic considerations for the successful operation of the proposed model. The proposed static analysis deployment model will be adapted and improved continuously to upgrade the quality of R&D projects and increase customer satisfaction with the developed source code and products.
Keywords: static analysis, code quality, coding rules, automation tool
Procedia PDF Downloads 520
21467 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model
Authors: Donatella Giuliani
Abstract:
In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. Firstly, the Firefly Algorithm has been applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique centered on the flashing characteristics of fireflies; in this context it is used to determine the number of clusters and the related cluster means in a histogram-based segmentation approach. Subsequently, these means are used in the initialization step for the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as prior probabilities of each component. Applying the Bayes rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears fairly solid and reliable even when applied to complex grayscale images. The validation has been performed using different standard measures, namely the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK), and the Davies-Bouldin (DB) index. The achieved results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is that using the maxima of the responsibilities for pixel assignment implies a consistent reduction of the computational cost.
Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation
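The final assignment step described above, maximum posterior responsibility under Bayes' rule, can be sketched with a small mixture whose weights, means, and variances stand in for EM-fitted values (all three components below are hypothetical):

```python
import math

# Pixel assignment by maximum posterior responsibility: each gray level
# g goes to the component k maximizing p(k | g) ∝ w_k * N(g; mu_k, sigma_k^2).
# The mixture parameters are hypothetical stand-ins for an EM-fitted GMM,
# with the means imagined as found by the Firefly Algorithm.

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

weights = [0.3, 0.5, 0.2]     # mixing coefficients (prior probabilities)
means = [40.0, 120.0, 210.0]  # cluster means on the gray-level axis
sigmas = [15.0, 20.0, 12.0]

def assign(gray):
    """Index of the component with the largest posterior responsibility."""
    post = [w * gauss_pdf(gray, m, s) for w, m, s in zip(weights, means, sigmas)]
    return max(range(len(post)), key=post.__getitem__)

labels = [assign(g) for g in (35, 130, 205)]  # one label per gray level
```

Because the posterior only needs to be compared across components, the normalizing constant of Bayes' rule is never computed, which is the source of the computational saving the abstract mentions.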
Procedia PDF Downloads 217
21466 The Spherical Geometric Model of Absorbed Particles: Application to the Electron Transport Study
Authors: A. Bentabet, A. Aydin, N. Fenineche
Abstract:
The mean penetration depth is of major importance in absorption transport phenomena. Analytical models of light-ion backscattering coefficients from solid targets have been made by Vicanek and Urbassek. In the present work, we show a mathematical expression (deterministic model) for Z1/2. To the best of our knowledge, only one analytical model exists for the electron or positron mean penetration depth in solid targets. In this work, we present a simple spherical geometric model of absorbed particles based on the CSDA scheme. Moreover, we show an analytical expression for the mean penetration depth obtained by combining our model with the Vicanek and Urbassek theory. For this, we used the Relativistic Partial Wave Expansion Method (RPWEM) and the optical dielectric model to calculate the elastic cross sections and the ranges, respectively. Good agreement was found with the experimental and theoretical data.
Keywords: Bentabet spherical geometric model, continuous slowing down approximation, stopping powers, ranges, mean penetration depth
Procedia PDF Downloads 641
21465 Expert Review on Conceptual Design Model of iTV Advertising towards Impulse Purchase
Authors: Azizah Che Omar
Abstract:
Various studies have proposed factors of impulse purchase in different advertising mediums such as websites, mobile, traditional retail stores, and traditional television. However, to the best of the researchers’ knowledge, none of the impulse purchase models is dedicated to impulse purchase tendency for interactive TV (iTV) advertising. Therefore, a conceptual design model of interactive television advertising towards impulse purchase (iTVAdIP) was developed. The focus of this study is to evaluate the conceptual design model of iTVAdIP through expert review. The findings show that the majority of expert reviewers agreed that the conceptual design model iTVAdIP is applicable to the development of interactive television advertising and will increase the effectiveness of advertising. This study also presents the reviewed conceptual design model of iTVAdIP.
Keywords: impulse purchase, interactive television advertising, persuasive
Procedia PDF Downloads 355
21464 The Effect of Soil Binder and Gypsum to the Changes of the Expansive Soil Shear Strength Parameters
Authors: Yulia Hastuti, Ratna Dewi, Muhammad Sandi
Abstract:
Soil stabilization can be performed by many methods, such as mixing in chemical admixtures. In this research, the soil was stabilized by mixing in two types of chemical admixture: gypsum, at dosages of 5%, 10%, and 15%, and soil binder, at concentrations of 20 gr/Lt, 25 gr/Lt, and 30 gr/Lt of water. The aim was to determine the effect on the soil plasticity index and to compare the shear strength parameters of the mixtures with the original soil conditions using the unconsolidated-undrained (UU) triaxial test. The research shows that with increasing admixture content, the plasticity index decreased from the original 42% (very high swelling degree) to 11.24% (low swelling degree) for the mixture of 15% gypsum and 30 gr/Lt of water of soil binder. The shear strength parameters increased for all mixture variations. The admixture with the highest shear strength parameters is the mixture of 15% gypsum and 20 gr/Lt of water of soil binder with a 14-day treatment period, which enhanced the cohesion by 559.01%, the friction angle by 1157.14%, and the shear strength by 568.49%. It can be concluded that a correct admixture of gypsum and soil binder can increase the shear strength parameters significantly and decrease the plasticity index of the soil.
Keywords: expansive soil, gypsum, soil binder, shear strength
Procedia PDF Downloads 475
21463 Coupled Analysis for Hazard Modelling of Debris Flow Due to Extreme Rainfall
Authors: N. V. Nikhil, S. R. Lee, Do Won Park
Abstract:
The Korean peninsula receives about two-thirds of its annual rainfall during the summer season. The extreme rainfall patterns due to typhoons and heavy rainfall result in severe mountain disasters, of which 55% are debris flows, a major natural hazard especially when occurring around major settlement areas. The basic mechanism underlying this kind of failure is unsaturated shallow slope failure: matric suction is reduced by infiltration of water, and the failed mass liquefies as positive pore water pressure is generated, leading to an abrupt loss of strength and the commencement of flow. An empirical model alone cannot simulate this complex mechanism. Hence, we have employed an empirical-physical approach for hazard analysis of debris flow using TRIGRS, a debris flow initiation criterion, and DAN3D in Mt. Woonmyun, South Korea. The debris flow initiation criterion is required to discern the potential landslides that can transform into debris flows. DAN3D, being a new model, does not have calibrated rheology parameters for Korean conditions. Thus, we have used the 2011 debris flow event at Mt. Woonmyun for the calibration of both TRIGRS and DAN3D, thereafter identifying and predicting the debris flow initiation points, path, run-out velocity, and area of spreading for future extreme-rainfall-based scenarios.
Keywords: debris flow, DAN-3D, extreme rainfall, hazard analysis
Procedia PDF Downloads 247
21462 Numerical Investigation into Capture Efficiency of Fibrous Filters
Authors: Jayotpaul Chaudhuri, Lutz Goedeke, Torsten Hallenga, Peter Ehrhard
Abstract:
Purification of gases from aerosols or airborne particles via filters is widely applied in industry and in our daily lives. This separation, especially in the micron and submicron size range, is a necessary step to protect the environment and human health. Fibrous filters are often employed due to their low cost and high efficiency. For designing any filter, the two most important performance parameters are capture efficiency and pressure drop. Since higher capture efficiency generally comes at the cost of a higher pressure drop, which leads to higher operating costs, a detailed investigation of the separation mechanism is required to optimize the filter design, i.e., to achieve a high capture efficiency with a low pressure drop. Therefore, a two-dimensional flow simulation around a single fiber using Ansys CFX and Matlab is used to gain insight into the separation process. Instead of simulating a solid fiber, the present Ansys CFX model uses a fictitious domain approach for the fiber by implementing a momentum loss model. This approach has been chosen to avoid creating a new mesh for different fiber sizes, thereby saving the time and effort of re-meshing. In a first step, only the flow of the continuous fluid around the fiber is simulated in Ansys CFX; the flow field data is then extracted and imported into Matlab, where the particle trajectory is calculated in a Matlab routine. This calculation is a Lagrangian, one-way coupled approach with all relevant forces acting on the particles. The key parameters for the simulation in both Ansys CFX and Matlab are the porosity ε, the diameter ratio of particle and fiber D, the fluid Reynolds number Re, the particle Reynolds number Rep, the Stokes number St, the Froude number Fr, and the density ratio of fluid and particle ρf/ρp. The simulation results were then compared to the single fiber theory from the literature.
Keywords: BBO equation, capture efficiency, CFX, Matlab, fibrous filter, particle trajectory
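The one-way coupled Lagrangian tracking described above can be sketched in its simplest form: explicit Euler integration of Stokes drag alone, in a uniform flow field, with hypothetical particle and fluid properties (the paper's routine additionally includes the other BBO-equation forces and an interpolated CFX flow field):

```python
# One-way coupled Lagrangian particle tracking sketch: explicit Euler
# integration of Stokes drag only, dv/dt = (u_fluid - v) / tau, with
# tau = rho_p * d_p^2 / (18 * mu) the particle response time.
# All property values below are hypothetical illustration numbers.

rho_p = 1000.0   # particle density, kg/m^3
d_p = 1e-6       # particle diameter, m
mu = 1.8e-5      # air dynamic viscosity, Pa*s
tau = rho_p * d_p ** 2 / (18.0 * mu)  # response time, s (~3e-6 s here)

u_fluid = 0.5    # uniform fluid velocity, m/s (stand-in for the CFX field)
v, x = 0.0, 0.0  # particle starts at rest at the origin
dt = tau / 10.0  # time step small enough to resolve the response time

for _ in range(100):               # integrate over ten response times
    v += dt * (u_fluid - v) / tau  # Stokes drag acceleration
    x += dt * v                    # advance particle position
```

After roughly ten response times the particle velocity has relaxed to the fluid velocity, which is why the Stokes number (tau relative to a flow time scale) governs whether particles follow the streamlines around the fiber or impact it.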
Procedia PDF Downloads 206
21461 Comparative Study of Seismic Isolation as Retrofit Method for Historical Constructions
Authors: Carlos H. Cuadra
Abstract:
Seismic isolation can be used as a retrofit method for historical buildings, with the advantage that minimum intervention on the superstructure is required. However, the selection of isolation devices depends on the weight and stiffness of the upper structure. In this study, two buildings are analyzed to evaluate the applicability of this retrofitting methodology. Both buildings are located in Akita prefecture, in the northern part of Japan. One is a wooden structure, the old council meeting hall of Noshiro city. The second is a brick masonry structure in Ani town that was used as the house of a foreign mining engineer. Ambient vibration measurements were performed on both buildings to estimate their dynamic characteristics. A target period of vibration of 3 seconds is then selected for the isolated systems to estimate the required stiffness of the isolation devices. For the wooden structure, which is a light construction, it was found that natural rubber isolators in combination with friction bearings are suitable for seismic isolation. In the case of the masonry building, elastomeric isolators can be used for its seismic isolation. Lumped mass systems are used for the seismic response analysis, and it is verified in both cases that seismic isolation can be used as a retrofitting method for historical constructions. However, in the case of the light building, most of the weight corresponds to the reinforced concrete slab that is required to install the isolation devices.
Keywords: historical building, finite element method, masonry structure, seismic isolation, wooden structure
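The stiffness estimate from a target period follows directly from the lumped single-mass relation T = 2π√(m/k); a minimal sketch with a hypothetical superstructure mass (the paper estimates the actual building masses) is:

```python
import math

# For a lumped single-mass isolated system, a target period T fixes the
# required total horizontal stiffness of the isolation layer:
#   T = 2*pi*sqrt(m / k)  =>  k = m * (2*pi / T)**2
# The mass below is a hypothetical illustration value.

def required_stiffness(mass_kg, target_period_s):
    return mass_kg * (2.0 * math.pi / target_period_s) ** 2

m = 2.0e5  # hypothetical superstructure mass, kg (200 t)
T = 3.0    # target isolated period, s, as in the study
k = required_stiffness(m, T)  # total isolator stiffness, N/m
f = 1.0 / T                   # isolated natural frequency, Hz
```

The 3-second target pushes the isolated frequency well below typical earthquake energy content, which is the point of the retrofit; the lighter the building, the softer the required isolators, which is why friction bearings supplement rubber isolators for the wooden hall.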
Procedia PDF Downloads 155
21460 Presenting the Mathematical Model to Determine Retention in the Watersheds
Authors: S. Shamohammadi, L. Razavi
Abstract:
Based on the principal concepts of the SCS-CN model, this paper presents a new mathematical model for the computation of retention potential (S). In the mathematical model, not only are the precipitation-runoff concepts of the SCS-CN model precisely represented in mathematical form, but new concepts, called 'maximum retention' and 'total retention', are introduced, and the concepts of potential retention capacity, maximum retention, and total retention are separated from each other. In the proposed model, actual retention (F), maximum actual retention (Fmax), total retention (S), maximum retention (Smax), and potential retention (Sp) are clearly defined for the first time, so that Sp is not a variable but a function of the morphological characteristics of the watershed. Indeed, based on the mathematical relation of the conceptual curve of the SCS-CN model, the proposed model provides a new method for the computation of actual retention in a watershed, and runoff is determined simply from it. In the corresponding relations, in addition to precipitation (P), initial retention (Ia), the cumulative values of actual retention capacity (F), total retention (S), runoff (Q), antecedent moisture (M), and potential retention (Sp), we introduce Fmax and Fmin, referring to the maximum and minimum actual retention, respectively, as well as ksh, a coefficient which depends on the morphological characteristics of the watershed. Advantages of the modified version versus the original model include better precision, higher performance, easier calibration, and faster computing.
Keywords: model, mathematical, retention, watershed, SCS
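The classical SCS-CN precipitation-runoff relation that the proposed model builds on can be sketched directly; the curve number and rainfall depth below are hypothetical illustration values:

```python
# Classical SCS-CN rainfall-runoff sketch (the baseline the abstract
# modifies, not the proposed model itself):
#   S  = 25400 / CN - 254            potential retention, mm
#   Ia = 0.2 * S                     initial abstraction (common choice)
#   Q  = (P - Ia)^2 / (P - Ia + S)   for P > Ia, else Q = 0
# CN and P below are hypothetical.

def scs_runoff(p_mm, cn, ia_ratio=0.2):
    """Runoff depth Q (mm) for rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0
    ia = ia_ratio * s
    if p_mm <= ia:
        return 0.0  # all rainfall retained before runoff begins
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

q = scs_runoff(p_mm=80.0, cn=75)  # runoff depth, mm
retained = 80.0 - q               # water retained by the watershed, mm
```

Everything the rainfall does not convert to runoff is retained, which is the quantity the paper's F, Fmax, and S definitions dissect further.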
Procedia PDF Downloads 457
21459 Continuous Fixed Bed Reactor Application for Decolourization of Textile Effluent by Adsorption on NaOH Treated Eggshell
Authors: M. Chafi, S. Akazdam, C. Asrir, L. Sebbahi, B. Gourich, N. Barka, M. Essahli
Abstract:
Fixed bed adsorption has become a frequently used industrial application in wastewater treatment processes, and various low-cost adsorbents have been studied for their applicability in the treatment of different types of effluents. In this work, the intention of the study was to explore the efficacy and feasibility of adsorption of the azo dye Acid Orange 7 (AO7) onto a fixed bed column of NaOH-treated eggshell (TES). The effects of various parameters, such as flow rate, initial dye concentration, and bed height, were explored in this study. The studies confirmed that the breakthrough curves were dependent on the flow rate, the initial dye concentration of the AO7 solution, and the bed depth. The Thomas, Yoon-Nelson, and Adams-Bohart models were analysed to evaluate the column adsorption performance. The adsorption capacity, rate constant, and correlation coefficient associated with each model were calculated and reported. The column experimental data fitted well with the Thomas model, with coefficients of correlation R2 ≥ 0.93 at different conditions, but the Yoon-Nelson, BDST, and Bohart-Adams models (R2 = 0.911) predicted poor performance of the fixed-bed column. TES was shown to be a suitable adsorbent for the adsorption of AO7 in a fixed-bed adsorption column.
Keywords: adsorption models, acid orange 7, bed depth, breakthrough, dye adsorption, fixed-bed column, treated eggshell
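The Thomas breakthrough curve the column data were fitted to can be sketched directly; all parameter values below are hypothetical, not the fitted TES/AO7 values from the study:

```python
import math

# Thomas model breakthrough sketch:
#   C_t / C_0 = 1 / (1 + exp(k_Th * q_0 * m / Q - k_Th * C_0 * t))
# k_Th: rate constant, q_0: adsorption capacity, m: adsorbent mass,
# Q: volumetric flow rate. All values below are hypothetical.

def thomas_ct_c0(t_min, k_th, q0, m_g, c0, flow):
    """Effluent-to-influent concentration ratio at time t (consistent units)."""
    return 1.0 / (1.0 + math.exp(k_th * q0 * m_g / flow - k_th * c0 * t_min))

k_th = 0.002  # L/(mg*min), hypothetical
q0 = 50.0     # mg dye per g adsorbent
m_g = 10.0    # g of adsorbent in the bed
c0 = 100.0    # mg/L influent dye concentration
flow = 0.02   # L/min

curve = [thomas_ct_c0(t, k_th, q0, m_g, c0, flow) for t in (0, 250, 500)]
```

The curve is sigmoidal: near zero while the bed adsorbs everything, 0.5 at the midpoint time q0*m/(c0*Q), and approaching 1 as the bed saturates; fitting k_Th and q0 to measured Ct/C0 data is what the R² values in the abstract quantify.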
Procedia PDF Downloads 377
21458 Salinity Stress: Effects on Growth Biochemical Parameters and Ion Homeostasis in Spinach (Spinacia Oleracea L.)
Authors: Umar Jaafar, Mungadi
Abstract:
Plant growth, biochemical parameters, cytotoxic ion sequestration, and ionic imbalance were determined for spinach in response to varied concentrations of NaCl. The plants showed a decline in all vegetative parameters measured. Free proline content increased with increasing salt concentration and differed significantly (p<0.05), while glycine betaine was insignificantly (p>0.05) affected by the NaCl concentration. Salinity increased the cytotoxic ions (sodium and chloride) and calcium, with a corresponding decrease in potassium ion concentration. The ionic balance (Na+/K+) was low due to the high content of potassium in the plants, with accumulation ranging from 7700 to 6500 mg/kg. It can be concluded that osmolyte accumulation and a high number of leaves are possible indicators of salt tolerance in spinach.
Keywords: spinach, salinity, osmolyte, cytotoxic
Procedia PDF Downloads 358
21457 A Study of Population Growth Models and Future Population of India
Authors: Sheena K. J., Jyoti Badge, Sayed Mohammed Zeeshan
Abstract:
A comparative study of exponential and logistic population growth models in India is presented. India is the second most populous country in the world, just behind China, and is projected to take first place soon. The Indian population has grown at a remarkably higher rate than other countries over the past 20 years. Many scientists and demographers have formulated various models of population growth in order to study and predict the future population. Some of these models are the Fibonacci population growth model, the exponential growth model, the logistic growth model, and the Lotka-Volterra model. These models have been effective in the past, to an extent, in predicting the population. However, a detailed comparative study between the population models is essential to arrive at a more accurate one. This research study therefore analyzes and compares the two population models under consideration, the exponential and logistic growth models, thereby identifying the more effective one. Using the census data of 2011, the approximate populations for 2016 to 2031 are calculated for 20 Indian states using both models, compared, and recorded against the actual population. On comparing the results of both models, it is found that the logistic population model is more accurate than the exponential model, and using this model we can predict the future population more effectively. This will give researchers an insight into how effective these population models are in predicting the future population.
Keywords: population growth, population models, exponential model, logistic model, Fibonacci model, Lotka-Volterra model, future population prediction, demographers
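The two models being compared have closed-form solutions, so a projection sketch is short; the growth rate and carrying capacity below are hypothetical illustration values, not the parameters fitted per state in the study:

```python
import math

# Exponential growth:  P(t) = P0 * exp(r * t)          (unbounded)
# Logistic growth:     P(t) = K / (1 + (K - P0)/P0 * exp(-r * t))
# r and K below are hypothetical; P0 is the 2011 census base in millions.

def exponential(p0, r, t):
    return p0 * math.exp(r * t)

def logistic(p0, r, k, t):
    return k / (1.0 + (k - p0) / p0 * math.exp(-r * t))

p0 = 1210.0  # India's 2011 census population, millions
r = 0.012    # hypothetical annual growth rate
k = 2000.0   # hypothetical carrying capacity, millions

for years in (5, 10, 20):
    pe = exponential(p0, r, years)
    pl = logistic(p0, r, k, years)
    # the logistic projection always stays below the exponential one
```

The structural difference is the source of the accuracy gap the study reports: the exponential curve grows without bound, while the logistic curve flattens toward K as density effects slow growth.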
Procedia PDF Downloads 124
21456 Polynomial Chaos Expansion Combined with Exponential Spline for Singularly Perturbed Boundary Value Problems with Random Parameter
Authors: W. K. Zahra, M. A. El-Beltagy, R. R. Elkhadrawy
Abstract:
Many practical problems in science and technology have developed over the past decades, for instance, mathematical boundary layer theory or the approximation of solutions to different problems described by differential equations. When such problems involve large or small parameters, they become increasingly complex and therefore require the use of asymptotic methods. In this work, we consider singularly perturbed boundary value problems which contain very small parameters; moreover, we treat these perturbation parameters as random variables. We propose a numerical method to solve this kind of problem, based on an exponential spline, a Shishkin mesh discretization, and a polynomial chaos expansion. The polynomial chaos expansion is used to handle the randomness in the perturbation parameter. Furthermore, Monte Carlo simulations (MCS) are used to validate the solution and the accuracy of the proposed method. Numerical results are provided to show the applicability and efficiency of the proposed method, which maintains a very remarkable high accuracy and achieves ε-uniform convergence of almost second order.
Keywords: singular perturbation problem, polynomial chaos expansion, Shishkin mesh, two small parameters, exponential spline
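A Shishkin mesh is piecewise uniform with a transition point tied to the perturbation parameter; a minimal sketch for a single boundary layer at x = 1 follows, where the layer location and the constant 2 in the transition point are one common choice and may differ from the paper's problem:

```python
import math

# Piecewise-uniform Shishkin mesh on [0, 1] for a boundary layer at x = 1:
# the transition point tau = min(1/2, 2 * eps * ln N) splits the interval,
# and N/2 equal cells are placed on each side, so the fine part resolves
# the layer. The constant 2 and the layer location depend on the problem.

def shishkin_mesh(n, eps):
    """n must be even; returns n+1 points clustered near x = 1."""
    tau = min(0.5, 2.0 * eps * math.log(n))
    coarse = [(1.0 - tau) * i / (n // 2) for i in range(n // 2)]
    fine = [(1.0 - tau) + tau * i / (n // 2) for i in range(n // 2 + 1)]
    return coarse + fine

mesh = shishkin_mesh(64, 1e-4)  # strongly graded toward x = 1 for small eps
```

Because tau shrinks with ε, the fine cells track the layer width automatically, which is what makes error bounds independent of ε (the ε-uniform convergence the abstract reports) possible.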
Procedia PDF Downloads 16021455 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures
Authors: Rui Teixeira, Alan O’Connor, Maria Nogal
Abstract:
The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The economic and environmental changes currently experienced by the world have emphasized the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. As an alternative, accurate modeling of the tails of statistical distributions and characterization of low-occurrence events can be achieved with the Peak-Over-Threshold (POT) methodology. The POT methodology allows a more refined fit of the statistical distribution by truncating the data at a predefined minimum threshold u. For mathematically approximating the tail of the empirical statistical distribution, the Generalised Pareto distribution is widely used. In the case of exceedances of significant wave data (H_s), however, the two-parameter Weibull distribution and the Exponential distribution, the latter a specific case of the Generalised Pareto distribution, are frequently used as alternatives. The Generalised Pareto, despite the existence of practical cases where it is applied, is not universally recognized as the adequate solution for modeling exceedances over a certain threshold u; references that treat the Generalised Pareto distribution as a secondary solution in the case of significant wave data can be identified in the literature. In this framework, the current study intends to tackle the discussion of the application of statistical models to characterize exceedances of wave data. A comparison of the application of the Generalised Pareto, the two-parameter Weibull, and the Exponential distribution is presented for different values of the threshold u. 
Real wave data obtained from four buoys along the Irish coast was used in the comparative analysis. Results show that the application of statistical distributions to characterize significant wave data needs to be addressed carefully, and in each particular case one of the statistical models mentioned fits the data better than the others. Depending on the value of the threshold u, different results are obtained. Other variables of the fit, such as the number of points and the estimation of the model parameters, are analyzed, and the respective conclusions are drawn. Some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves to be, for the present case, a highly non-linear task and, due to its growing importance, should be addressed carefully for an efficient estimation of very-low-occurrence events.Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data
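The three-way comparison described above can be sketched with `scipy.stats`: truncate the record at a threshold u, fit each candidate distribution to the exceedances with the location fixed at zero, and rank the fits by log-likelihood. The synthetic wave record and the 95th-percentile threshold are assumptions standing in for the Irish buoy data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic significant-wave-height record in metres; real buoy data
# from the four Irish-coast stations would replace this
hs = stats.weibull_min.rvs(c=1.6, scale=1.8, size=5000, random_state=rng)

u = np.quantile(hs, 0.95)        # assumed threshold for the POT method
exceedances = hs[hs > u] - u     # peaks over the threshold, shifted to zero

candidates = {
    "genpareto": stats.genpareto,
    "weibull_min": stats.weibull_min,   # two-parameter Weibull (loc fixed)
    "expon": stats.expon,               # special case of the Generalised Pareto
}
log_likelihoods = {}
for name, dist in candidates.items():
    params = dist.fit(exceedances, floc=0.0)  # location pinned at the threshold
    log_likelihoods[name] = np.sum(dist.logpdf(exceedances, *params))
best = max(log_likelihoods, key=log_likelihoods.get)
```

Rerunning the loop for several values of u reproduces the study's central observation: the winning model changes with the threshold, so no single family should be assumed a priori.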
Procedia PDF Downloads 27221454 A Mean–Variance–Skewness Portfolio Optimization Model
Authors: Kostas Metaxiotis
Abstract:
Portfolio optimization is one of the most important topics in finance. This paper proposes a mean–variance–skewness (MVS) portfolio optimization model. Traditionally, the portfolio optimization problem is solved within the mean–variance (MV) framework. In this study, we formulate the proposed model as a three-objective optimization problem, where the portfolio's expected return and skewness are maximized while the portfolio risk is minimized. For solving the proposed three-objective portfolio optimization model, we apply an adapted version of the non-dominated sorting genetic algorithm (NSGA-II). Finally, we use a real dataset from the FTSE-100 to validate the proposed model.Keywords: evolutionary algorithms, portfolio optimization, skewness, stock selection
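The three objectives named in the abstract can be computed directly from a weight vector and a return matrix; a minimal sketch follows. The simulated returns stand in for the FTSE-100 data, and the NSGA-II search itself is not reproduced, only the objective evaluation it would call:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated monthly returns for 5 assets (120 months); FTSE-100
# constituent returns would be used in practice
returns = rng.normal(0.01, 0.05, size=(120, 5))

def mvs_objectives(weights, returns):
    # Evaluate the three MVS objectives for one candidate portfolio
    port = returns @ weights
    mean = port.mean()                                   # maximize
    variance = port.var(ddof=1)                          # minimize (risk)
    skewness = ((port - mean) ** 3).mean() / port.std() ** 3  # maximize
    return mean, variance, skewness

w = np.full(5, 0.2)  # equally weighted candidate, weights sum to 1
mean, var, skew = mvs_objectives(w, returns)
```

NSGA-II would generate many such weight vectors, call `mvs_objectives` on each, and keep the non-dominated set, leaving the investor to pick a point on the mean–variance–skewness front.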
Procedia PDF Downloads 19821453 Developing a Total Quality Management Model Using Structural Equation Modeling for Indonesian Healthcare Industry
Authors: Jonny, T. Yuri M. Zagloel
Abstract:
This paper presents an Indonesian healthcare model. Currently, there are nine TQM (Total Quality Management) practices in the healthcare industry; however, these practices are not yet integrated. Therefore, this paper aims to integrate these practices into a model by using Structural Equation Modeling (SEM). After administering about 210 questionnaires to various stakeholders of this industry, the LISREL program was used to evaluate the model's fitness. The result confirmed that the model fits, because the p-value was about 0.45, above the required 0.05. This signifies that the nine TQM practices mentioned previously can be integrated into an Indonesian healthcare model.Keywords: healthcare, total quality management (TQM), structural equation modeling (SEM), linear structural relations (LISREL)
Procedia PDF Downloads 29221452 Enhanced CNN for Rice Leaf Disease Classification in Mobile Applications
Authors: Kayne Uriel K. Rodrigo, Jerriane Hillary Heart S. Marcial, Samuel C. Brillo
Abstract:
Rice leaf diseases significantly impact yield in rice-dependent countries, affecting their agricultural sectors. As part of precision agriculture, early and accurate detection of these diseases is crucial for effective mitigation practices and for minimizing crop losses. Hence, this study proposes an enhancement to the Convolutional Neural Network (CNN), a widely used method for rice leaf disease image classification, by incorporating MobileViTV2, a recent architecture that combines CNN and Vision Transformer models while maintaining fewer parameters, making it suitable for broader deployment on edge devices. Our methodology utilizes a publicly available rice disease image dataset from Kaggle, which was validated by a university structural biologist following the guidelines provided by the Philippine Rice Institute (PhilRice). Modifications to the dataset include renaming certain disease categories and augmenting the rice leaf image data through rotation, scaling, and flipping. The enhanced dataset was then used to train the MobileViTV2 model using the Timm library. The results of our approach are as follows: the model achieved notable performance, with 98% accuracy in both training and validation, a training and validation loss of 6%, and a Receiver Operating Characteristic (ROC) curve ranging from 95% to 100% for each label. Additionally, the F1 score was 97%. These metrics demonstrate a significant improvement over a conventional CNN-based approach, which, in a previous 2022 study, achieved only 78% accuracy using 5 convolutional layers and 2 dense layers. Thus, it can be concluded that MobileViTV2, with its fewer parameters, outperforms traditional CNN models, particularly when applied to rice leaf disease image identification. 
For future work, we recommend extending this model to include datasets validated by international rice experts and broadening the scope to accommodate biotic factors such as rice pest classification, as well as abiotic stressors such as climate, soil quality, and geographic information, which could improve the accuracy of disease prediction.Keywords: convolutional neural network, MobileViTV2, rice leaf disease, precision agriculture, image classification, vision transformer
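The augmentation step described in the methodology (rotation, scaling, and flipping of the leaf images) can be sketched with plain NumPy; the Timm training pipeline and the MobileViTV2 model itself are not reproduced here. The array shapes and the nearest-neighbour 2x zoom used for "scaling" are illustrative assumptions:

```python
import numpy as np

def augment(image):
    # Produce rotated, flipped, and scaled variants of one (H, W, C) image array
    rotated = np.rot90(image, k=1)          # 90-degree rotation
    flipped_h = np.flip(image, axis=1)      # horizontal flip
    flipped_v = np.flip(image, axis=0)      # vertical flip
    # Nearest-neighbour 2x zoom on the central crop as a simple scaling step
    h, w = image.shape[:2]
    crop = image[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]
    scaled = np.repeat(np.repeat(crop, 2, axis=0), 2, axis=1)
    return rotated, flipped_h, flipped_v, scaled

# Tiny stand-in for one rice-leaf image
img = np.arange(48, dtype=np.uint8).reshape(4, 4, 3)
variants = augment(img)
```

Each original image thus yields four additional training samples, which is how the enhanced dataset grows before it is handed to the Timm training loop.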
Procedia PDF Downloads 2221451 Equilibrium Modeling of a Two Stage Downdraft Gasifier Using Different Gasification Fluids
Authors: F. R. M. Nascimento, E. E. S. Lora, J. C. E. Palácio
Abstract:
A mathematical model to investigate the performance of a two-stage fixed-bed downdraft gasifier operating with air, steam, and oxygen mixtures as the gasifying fluid has been developed. Various mixture conditions for a double-stage fluid entry have been evaluated. The model has been validated through a series of experimental tests performed by NEST, the Excellence Group in Thermal and Distributed Generation of the Federal University of Itajubá. The influence of the mixtures is analyzed through the Steam-to-Biomass (SB), Equivalence Ratio (ER), and Oxygen Concentration (OP) parameters in order to predict the best operating conditions for obtaining adequate output gas quality, since gas quality is a key parameter for subsequent gas processing in the synthesis of biofuels and in heat and electricity generation. Results show that there is an optimal combination of the steam and oxygen content of the gasifying fluid, which allows the user to find the best conditions to design and operate the equipment according to the desired application.Keywords: air, equilibrium, downdraft, fixed bed gasification, mathematical modeling, mixtures, oxygen steam
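The three operating parameters named above have standard definitions in the gasification literature, sketched below; the numeric values are illustrative only and are not NEST experimental figures, and the paper's exact definitions may differ in detail:

```python
def gasifier_parameters(m_steam, m_biomass, m_air, m_air_stoich, o2_frac):
    # Steam-to-Biomass ratio: mass of injected steam per mass of dry biomass
    sb = m_steam / m_biomass
    # Equivalence Ratio: actual air supplied over the stoichiometric air
    # required for complete combustion (ER < 1 in gasification)
    er = m_air / m_air_stoich
    # Oxygen concentration of the gasifying fluid (mole/volume fraction)
    op = o2_frac
    return sb, er, op

# Hypothetical mass flows in kg/h and an air-like oxygen fraction
sb, er, op = gasifier_parameters(
    m_steam=0.5, m_biomass=1.0, m_air=1.5, m_air_stoich=5.0, o2_frac=0.21
)
```

Sweeping SB, ER, and OP over a grid and evaluating the equilibrium model at each point is how the optimal mixture combination reported in the abstract would be located.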
Procedia PDF Downloads 48121450 Influence of Densification Process and Material Properties on Final Briquettes Quality from FastGrowing Willows
Authors: Peter Križan, Juraj Beniak, Ľubomír Šooš, Miloš Matúš
Abstract:
Biomass treatment through densification is a very suitable and important technology preceding its effective energy recovery. The densification process of biomass is significantly influenced by various technological and material parameters, which are ultimately reflected in the final solid biofuel quality. The paper deals with experimental research on the relationship between technological and material parameters during the densification of fast-growing trees, namely fast-growing willow. The main goal of the presented experimental research is to determine the relationship between pressing pressure and raw-material fraction size from the point of view of final briquette density. The experimental research was realized by single-axis densification. The impact of fraction size, in interaction with pressing pressure and stabilization time, on the quality properties of briquettes was determined. The interaction of these parameters affects the final solid biofuel (briquette) quality. From the point of view of briquette production, and also of densification machine construction, it is very important to know the mutual interaction of these parameters and its effect on final briquette quality. The experimental findings presented here show the importance of the mentioned parameters during the densification process.Keywords: briquettes density, densification, fraction size, pressing pressure, stabilization time
Procedia PDF Downloads 36821449 A Research on Flipped-Classroom Teaching Model in English for Academic Purpose Teaching
Authors: Li Shuang
Abstract:
With rigid teaching procedures and limited academic performance assessment methods, the traditional teaching model stands in the way of college English reform in China, which features EAP (English for Academic Purpose) teaching. Flipped-classroom teaching, which has been extensively applied to the teaching of science subjects, however, covers the shortcomings of the traditional teaching model in EAP teaching by creatively inverting traditional teaching procedures. Besides, the application of the flipped-classroom teaching model in EAP teaching also proves that this new teaching philosophy is not confined to science subjects; it goes perfectly well with liberal-arts teaching. Data analysis, a desk research survey, and a comparative study are employed in the essay to prove the model's feasibility and advantages in EAP teaching.Keywords: EAP, traditional teaching method, flipped-classroom teaching model, teaching model design
Procedia PDF Downloads 31121448 Bi-Criteria Objective Network Design Model for Multi Period Multi Product Green Supply Chain
Authors: Shahul Hamid Khan, S. Santhosh, Abhinav Kumar Sharma
Abstract:
Environmental performance, along with social performance, is becoming a vital factor for industries seeking to achieve global standards. With a good environmental policy, global industries differentiate themselves from their competitors. This paper concentrates on a multi-stage, multi-product, and multi-period manufacturing network. Bi-objective mathematical models for the total cost and total emission of the entire forward supply chain are considered. Five different problems, obtained by varying the number of suppliers, manufacturers, and environmental levels, are considered to illustrate the mathematical model. A Genetic Algorithm (GA) and random search are used for finding the optimal solution. The input parameters of the optimal solution are used to find the trade-off between the initial investment by the industry and the long-term benefit to the environment.Keywords: closed loop supply chain, genetic algorithm, random search, green supply chain
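The random-search half of the solution approach above can be sketched as follows: sample candidate network designs, evaluate both objectives, and keep the non-dominated set as the cost–emission trade-off curve. The two quadratic objectives and the three-dimensional design vector are toy stand-ins for the paper's supply chain model:

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(x):
    # Toy total-cost objective over a continuous design vector
    return np.sum((x - 0.2) ** 2)

def emission(x):
    # Toy total-emission objective, deliberately in conflict with cost
    return np.sum((x - 0.8) ** 2)

# Random search: sample candidate designs uniformly in [0, 1]^3
candidates = rng.uniform(0.0, 1.0, size=(500, 3))
objs = np.array([(cost(x), emission(x)) for x in candidates])

def pareto_front(objs):
    # Keep indices of points not dominated by any other point
    # (both objectives are minimized)
    keep = []
    for i, p in enumerate(objs):
        dominated = np.any(
            np.all(objs <= p, axis=1) & np.any(objs < p, axis=1)
        )
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = pareto_front(objs)
```

Each point on `front` pairs an investment level with an emission level, which is the trade-off the decision maker inspects; the GA refines the same front with crossover and mutation instead of blind sampling.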
Procedia PDF Downloads 549