Search results for: model prediction
Paper Count: 7865


4145 Tuned Mass Damper Effects of Stationary People on Structural Damping of Footbridge Due to Dynamic Interaction in Vertical Motion

Authors: M. Yoneda

Abstract:

It is known that stationary human occupants act as dynamic mass-spring-damper systems and can change the modal properties of civil engineering structures. This paper describes full-scale measurements carried out to explain the tuned mass damper effect of stationary people on the structural damping of a footbridge with a center span length of 33 m. A human body can be represented by a lumped system consisting of masses, springs, and dashpots. A complex eigenvalue calculation is also conducted using the ISO 5982:1981 human model (a two-degree-of-freedom system). Based on experimental and analytical results for the footbridge with stationary people in the standing position, it is demonstrated that stationary people behave as a tuned mass damper and that the ISO 5982:1981 human model can explain the structural damping characteristics measured in the field.

Keywords: Dynamic interaction, footbridge, stationary people, structural damping.
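
As a rough illustration of the complex eigenvalue calculation mentioned in the abstract, the sketch below couples a single bridge mode with a lumped occupant mass-spring-damper and extracts damped frequencies and damping ratios; all numerical values are hypothetical placeholders, not the paper's or ISO 5982:1981 parameters.

```python
import numpy as np

# Bridge mode (SDOF) coupled with a lumped human mass-spring-damper,
# in the spirit of the ISO 5982 occupant model. Values are illustrative.
m_b, k_b, c_b = 40e3, 2.4e6, 6.0e3      # bridge modal mass [kg], stiffness [N/m], damping [Ns/m]
m_h, k_h, c_h = 3.0e3, 1.0e6, 30.0e3    # lumped stationary-crowd mass, stiffness, damping

M = np.diag([m_b, m_h])
K = np.array([[k_b + k_h, -k_h], [-k_h, k_h]])
C = np.array([[c_b + c_h, -c_h], [-c_h, c_h]])

# First-order (state-space) form for the complex eigenvalue problem
A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
lam = np.linalg.eigvals(A)
lam = lam[np.imag(lam) > 0]             # keep one eigenvalue per conjugate pair

for l in sorted(lam, key=lambda z: abs(z)):
    f = abs(l) / (2 * np.pi)            # damped natural frequency [Hz]
    zeta = -np.real(l) / abs(l)         # modal damping ratio
    print(f"f = {f:5.2f} Hz, zeta = {zeta:5.3f}")
```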

4144 Modeling of Reinforcement in Concrete Beams Using Machine Learning Tools

Authors: Yogesh Aggarwal

Abstract:

The paper discusses the results obtained in predicting the reinforcement of a singly reinforced beam using Neural Networks (NN), Support Vector Machines (SVMs) and tree-based models. A major advantage of SVMs over NNs is that they minimize a bound on the generalization error of the model rather than a bound on the mean square error over the data set, as done in NNs. The tree-based approach divides the problem into a small number of sub-problems to reach a conclusion. A number of data sets were created for different beam parameters, with the reinforcement calculated using the limit state method, for model creation and validation. The results from this study suggest a remarkably good performance of the tree-based and SVM models. Further, this study found that these two techniques work well, and even better than Neural Network methods. A comparison of predicted values with actual values suggests a very good correlation coefficient for all four techniques.

Keywords: Linear Regression, M5 Model Tree, Neural Network, Support Vector Machines.
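
A minimal sketch of the kind of model comparison described above, assuming synthetic beam data and a simplified flexural relation in place of the paper's limit-state calculations:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Hypothetical beam parameters: width b [mm], effective depth d [mm], moment Mu [kNm]
b = rng.uniform(200, 400, 500)
d = rng.uniform(300, 700, 500)
Mu = rng.uniform(50, 400, 500)
# Stand-in "ground truth" steel area [cm^2] from a simplified flexural relation
# (not the limit state design calculation used in the paper)
As = Mu * 1e6 / (0.87 * 415 * 0.9 * d) / 100.0

X = np.column_stack([b, d, Mu])
X_tr, X_te, y_tr, y_te = train_test_split(X, As, random_state=0)

models = {
    "Linear Regression": LinearRegression(),
    "Decision Tree": DecisionTreeRegressor(max_depth=6, random_state=0),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVR(C=100.0)),
    "Neural Net": make_pipeline(StandardScaler(),
                                MLPRegressor(hidden_layer_sizes=(32, 32),
                                             max_iter=5000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:18s} R^2 = {model.score(X_te, y_te):.3f}")
```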

4143 Finite Volume Method for Flow Prediction Using Unstructured Meshes

Authors: Juhee Lee, Yongjun Lee

Abstract:

In designing low-energy-consuming buildings, the heat transfer through large glazed areas or walls becomes critical. Multiple layers of window glass and wall material are employed for high insulation. The gravity-driven air flow between window glasses or wall layers is a natural convection phenomenon that is a key component of the heat transfer. As the first step of the natural heat transfer analysis, this study presents the development and application of a finite volume method for the numerical computation of viscous incompressible flows. It will become part of a natural convection analysis with a high-order scheme, a multigrid method, and dual time stepping in the future. A finite volume method based on a fully implicit second-order scheme is used to discretize and solve the fluid flow on unstructured grids composed of arbitrarily shaped cells. The governing equations are integrated in the finite volume manner using a collocated arrangement of variables. The convergence of the SIMPLE segregated algorithm for the solution of the coupled nonlinear algebraic equations is accelerated by using a sparse matrix solver such as BiCGSTAB. The method used in the present study is verified by applying it to flows for which either the numerical solution is known or the solution can be obtained using another numerical technique available in other studies. The accuracy of the method is assessed through grid refinement.

Keywords: Finite volume method, fluid flow, laminar flow, unstructured grid.
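
To illustrate only the linear-solver step mentioned above (not the full SIMPLE algorithm on an unstructured grid), the following sketch solves a Poisson-like pressure-correction system with BiCGSTAB; the Cartesian grid and the right-hand side are stand-in assumptions:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab

# Inside a SIMPLE iteration the pressure-correction equation reduces to a
# sparse system A p' = b. A uniform Cartesian grid stands in for the paper's
# unstructured mesh, purely to show the sparse solver call.
n = 50
h = 1.0 / n
main = 4.0 * np.ones(n * n)
off = -1.0 * np.ones(n * n - 1)
off[np.arange(1, n * n) % n == 0] = 0.0        # no coupling across row boundaries
A = sp.diags([main, off, off, -np.ones(n * n - n), -np.ones(n * n - n)],
             [0, -1, 1, -n, n], format="csr") / h**2

b = np.ones(n * n)                              # hypothetical mass-imbalance source term
p_prime, info = bicgstab(A, b)
print("BiCGSTAB converged" if info == 0 else f"info = {info}",
      "| max correction =", float(p_prime.max()))
```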

4142 A Formative Assessment Model within the Competency-Based-Approach for an Individualized E-learning Path

Authors: El Falaki Brahim, Khalidi Idrissi Mohammed, Bennani Samir

Abstract:

E-learning is not restricted to the use of new technologies for online content; it also induces the adoption of new approaches to improve the quality of education. This quality depends on the ability of these approaches (technical and pedagogical) to provide an adaptive learning environment. Thus, the environment should include features that convey intentions and meet the educational needs of learners by providing a customized learning path towards acquiring the competency concerned. In our proposal, we believe that an individualized learning path requires knowledge of the learner. Therefore, it must pass through a personalized diagnosis to identify precisely the competency gaps to fill and to reduce the cognitive load. To personalize the diagnosis and pertinently measure the competency gap, we suggest implementing formative assessment in the e-learning environment, and we propose the introduction of a pre-regulation process in the area of formative assessment, involving its individualization and implementation in e-learning.

Keywords: Competency-Based-Approach, E-learning, Formative assessment, learner model, Modeling, pre-regulation process

4141 Implicit Eulerian Fluid-Structure Interaction Method for the Modeling of Highly Deformable Elastic Membranes

Authors: Aymen Laadhari, Gábor Székely

Abstract:

This paper is concerned with the development of a fully implicit and purely Eulerian fluid-structure interaction method tailored for the modeling of the large deformations of elastic membranes in a surrounding Newtonian fluid. We consider a simplified model for the mechanical properties of the membrane, in which the surface strain energy depends on the membrane stretching. The fully Eulerian description is based on the advection of a modified surface tension tensor, and the deformations of the membrane are tracked using a level set strategy. The resulting nonlinear problem is solved by a Newton-Raphson method, featuring a quadratic convergence behavior. A monolithic solver is implemented, and we report several numerical experiments aimed at model validation and illustrating the accuracy of the presented method. We show that stability is maintained for significantly larger time steps.

Keywords: Fluid-membrane interaction, stretching, Eulerian, finite element method, Newton, implicit.

4140 Estimation Model for Concrete Slump Recovery by Using Superplasticizer

Authors: Chaiyakrit Raoupatham, Ram Hari Dhakal, Chalermchai Wanichlamlert

Abstract:

This paper introduces a solution for concrete slump recovery using a type-F chemical admixture (superplasticizer, naphthalene based) in practice, in order to solve the problem of unusable concrete once it has lost its slump, especially in tropical countries where the slump loss rate is faster. On the other hand, adding superplasticizer to concrete arbitrarily can cause the concrete to segregate. Therefore, this paper also develops an estimation model used to calculate the second dose of superplasticizer needed for concrete slump recovery. The fresh properties of ordinary Portland cement concrete with a volumetric ratio of paste to void between aggregate (paste content) of 1.1-1.3, a water-cement ratio of 0.30 to 0.67, and an initial superplasticizer (naphthalene base) dosage of 0.25%-1.6% were tested for initial slump and for slump loss every 30 minutes over one and a half hours by the slump cone test. Concretes with slump loss ranging from 10% to 90% were re-dosed and successfully recovered back to their initial slump. The slump after re-dosing was measured by the slump cone test. From the results, it is concluded that slump loss was slower for mixes with a high initial dose of superplasticizer, because the added superplasticizer disturbs cement hydration. The required second dose of superplasticizer was affected by two major parameters, water-cement ratio and paste content, where a lower water-cement ratio and lower paste content increase the required second dose of superplasticizer. The second dose of superplasticizer is higher as the solid content within the system increases, where the solids can be either cement particles or aggregate. The data were analyzed to form an equation used to estimate the second dosage of superplasticizer required to recover the slump to its original value.

Keywords: Estimation model, second superplasticizer dosage, slump loss, slump recovery.
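
A hedged sketch of how such an estimation equation could be fitted; the functional form with parameters a, b, c and the synthetic data are assumptions, not the relation derived in the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a second-dose estimate from water-cement ratio (wc) and paste content (pc).
# Form and data are hypothetical; the paper derives its own equation from tests.
def second_dose(X, a, b, c):
    wc, pc = X
    return a + b / wc + c / pc          # dose grows as wc and pc decrease

rng = np.random.default_rng(1)
wc = rng.uniform(0.30, 0.67, 40)
pc = rng.uniform(1.1, 1.3, 40)
dose_obs = 0.05 + 0.12 / wc + 0.30 / pc + rng.normal(0, 0.01, 40)  # synthetic "measurements"

params, _ = curve_fit(second_dose, (wc, pc), dose_obs)
print("fitted a, b, c =", np.round(params, 3))
print("predicted dose at wc=0.45, pc=1.2:",
      round(second_dose((0.45, 1.2), *params), 3), "% by weight of cement")
```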

4139 Processes Simulation Study of Coal to Methanol Based on Gasification Technology

Authors: Po-Chuang Chen, Hsiu-Mei Chiu, Yau-Pin Chyou, Chiou-Shia Yu

Abstract:

This study presents a simulation model for converting coal to methanol, based on gasification technology, built with the commercial chemical process simulator Pro/II® V8.1.1. The methanol plant consists of an air separation unit (ASU), a gasification unit, a gas clean-up unit, and a methanol synthesis unit. The clean syngas is produced by the first three operating units, and the model has been verified with reference data from the United States Environmental Protection Agency. The liquid phase methanol (LPMEOH™) process is adopted in the methanol synthesis unit. Clean syngas goes through a gas handling section to reach the reaction requirements, a reactor loop/catalyst to generate methanol, and methanol distillation to reach the desired purity of over 99.9 wt%. The ratio of the total energy contained in methanol and dimethyl ether to that of the feed coal is 78.5% (gross efficiency). The net efficiency is 64.2% with the internal power consumption taken into account, based on the assumption that the efficiency of electricity generation is 40%.

Keywords: Gasification, Methanol, LPMEOH, System-level simulation.

4138 The Use of Artificial Neural Network in Option Pricing: The Case of S and P 100 Index Options

Authors: Zeynep İltüzer Samur, Gül Tekin Temur

Abstract:

Due to the increasing and varying risks that economic units face, derivative instruments have gained substantial importance, and trading volumes of derivatives have reached very significant levels. In parallel with these high trading volumes, researchers have developed many different models; some are parametric, some are nonparametric. The aim of this study is to analyse the success of artificial neural networks in the pricing of options using S&P 100 index options data. Previous studies generally cover data on European-type call options. This study includes not only European call options but also American call and put options and European put options. Three data sets are used to build three different ANN models. One includes only data that are directly observed from the economic environment, i.e. strike price, spot price, interest rate, maturity and type of the contract. The others include an extra input that is not observable but a parameter, i.e. volatility. With these detailed data, the performance of the ANN along the put/call, American/European and moneyness dimensions is analyzed, and whether using volatility as an input improves the prediction performance is examined. The most striking results revealed by the study are that the ANN performs better when pricing call options than put options, and that the use of the volatility parameter as an input does not improve the performance.

Keywords: Option Pricing, Neural Network, S&P 100 Index, American/European options
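
The sketch below mirrors the two input configurations compared in the study (with and without volatility), but on synthetic Black-Scholes call prices, where the volatility input is informative by construction; the paper's conclusion rests on S&P 100 market data, not on data of this kind.

```python
import numpy as np
from scipy.stats import norm
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
S = rng.uniform(80, 120, n); K = rng.uniform(80, 120, n)
T = rng.uniform(0.05, 1.0, n); r = rng.uniform(0.01, 0.05, n)
sigma = rng.uniform(0.1, 0.5, n)

# Synthetic European call prices from the Black-Scholes formula
d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
price = S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

for name, X in {"without volatility": np.column_stack([S, K, T, r]),
                "with volatility":    np.column_stack([S, K, T, r, sigma])}.items():
    X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)
    ann = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(64, 64),
                                     max_iter=3000, random_state=0))
    ann.fit(X_tr, y_tr)
    print(f"ANN {name:18s} R^2 = {ann.score(X_te, y_te):.4f}")
```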

4137 A Semi-Implicit Phase Field Model for Droplet Evolution

Authors: M. H. Kazemi, D. Salac

Abstract:

A semi-implicit phase field method for droplet evolution is proposed. Using the phase field Cahn-Hilliard equation, we are able to track the interface in multiphase flow. The idea of a semi-implicit finite difference scheme is reviewed and employed to solve two nonlinear equations, including the Navier-Stokes and the Cahn-Hilliard equations. The use of a semi-implicit method allows us to have larger time steps compared to explicit schemes. The governing equations are coupled and then solved by a GMRES solver (generalized minimal residual method) using modified Gram-Schmidt orthogonalization. To show the validity of the method, we apply the method to the simulation of a rising droplet, a leaky dielectric drop and the coalescence of drops. The numerical solutions to the phase field model match well with existing solutions over a defined range of variables.

Keywords: Coalescence, leaky dielectric, numerical method, phase field, rising droplet, semi-implicit method.
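
A toy one-dimensional sketch of the semi-implicit idea described above, applied to the Cahn-Hilliard equation alone (without the coupled Navier-Stokes solve and GMRES machinery of the paper); the grid size, interface width and time step are illustrative assumptions:

```python
import numpy as np

# Semi-implicit update for c_t = laplacian(c^3 - c - eps^2 laplacian(c)):
# the stiff biharmonic term is treated implicitly and the nonlinear chemical
# potential explicitly, in Fourier space, which permits larger time steps
# than a fully explicit scheme.
N, L, eps, dt = 256, 2 * np.pi, 0.05, 2e-4
x = np.linspace(0, L, N, endpoint=False)
k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi
c = 0.1 * np.cos(3 * x) + 0.01 * np.random.default_rng(0).standard_normal(N)

for step in range(5000):
    f_hat = np.fft.fft(c**3 - c)                      # explicit nonlinear part
    c_hat = (np.fft.fft(c) - dt * k**2 * f_hat) / (1.0 + dt * eps**2 * k**4)
    c = np.real(np.fft.ifft(c_hat))

print(f"phase-field bounds after 5000 steps: {c.min():.3f} to {c.max():.3f}")
```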

4136 The Impact of Digital Inclusive Finance on the High-Quality Development of China's Export Trade

Authors: Yao Wu

Abstract:

In the context of financial globalization, China has put forward the policy goal of high-quality development, and the digital economy, with its advantage in information resources, is driving China's export trade towards high-quality development. Due to the long-standing financing constraints of small and medium-sized export enterprises, expanding the export scale of these enterprises has become a major hurdle for the development of China's export trade. This paper first adopts the hierarchical analysis method to establish an evaluation system for the high-quality development of China's export trade; second, panel data of 30 Chinese provinces from 2011 to 2018 are selected for empirical analysis to establish a model of the impact of digital inclusive finance on the high-quality development of China's export trade; based on the analysis of the heterogeneous-firm trade model, a mediating effect model is established to verify the mediating role of credit constraints in the high-quality development of China's export trade. Based on the above analysis, this paper concludes that digital inclusive finance, with its unique digital and inclusive nature, alleviates the credit constraint problem among SMEs, enhances the binary margins of SMEs' exports, optimizes their export scale and structure, and promotes the high-quality development of regional and even national export trade. Finally, based on these findings, we propose insights and suggestions for digital inclusive finance to promote the high-quality development of export trade.

Keywords: Digital inclusive finance, high-quality development of export trade, fixed effects, binary marginal effects.

4135 Modeling of Compaction Curves for Corn Cob Ash-Cement Stabilized Lateritic Soils

Authors: O. A. Apampa, Y. A. Jimoh, K. A. Olonade

Abstract:

The need to save time and cost of soil testing at the planning stage of road work has necessitated developing predictive models. This study proposes a model for predicting the dry density of lateritic soils stabilized with corn cob ash (CCA) and blended cement - CCA. Lateritic soil was first stabilized with CCA at 1.5, 3.0, 4.5 and 6% of the weight of soil and then stabilized with the same proportions as replacement for cement. Dry density, specific gravity, maximum degree of saturation and moisture content were determined for each stabilized soil specimen, following standard procedure. Polynomial equations containing alpha and beta parameters for CCA and blended CCA-cement were developed. Experimental values were correlated with the values predicted from the Matlab curve fitting tool, and the Solver function of Microsoft Excel 2010. The correlation coefficient (R2) of 0.86 was obtained indicating that the model could be accepted in predicting the maximum dry density of CCA stabilized soils to facilitate quick decision making in roadworks.

Keywords: Corn cob ash, lateritic soil, stabilization, maximum dry density, moisture content.
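
A small sketch of fitting a quadratic compaction curve and reporting R², in the spirit of the modeling above; the moisture-density points are hypothetical, not the CCA-stabilized test data:

```python
import numpy as np

# Fit a polynomial compaction curve (dry density vs. moisture content) and
# report R^2 and the optimum moisture content. Data are illustrative.
w = np.array([8, 10, 12, 14, 16, 18, 20.0])                   # moisture content [%]
rho_d = np.array([1.62, 1.71, 1.78, 1.81, 1.79, 1.73, 1.66])  # dry density [Mg/m^3]

coeffs = np.polyfit(w, rho_d, deg=2)                  # alpha/beta-style parameters
rho_fit = np.polyval(coeffs, w)

ss_res = np.sum((rho_d - rho_fit) ** 2)
ss_tot = np.sum((rho_d - rho_d.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
w_opt = -coeffs[1] / (2 * coeffs[0])                  # vertex of the parabola
print(f"R^2 = {r2:.3f}, optimum moisture content ~ {w_opt:.1f}%,"
      f" predicted MDD ~ {np.polyval(coeffs, w_opt):.3f} Mg/m^3")
```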

4134 Monomial Form Approach to Rectangular Surface Modeling

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Geometric modeling plays an important role in the construction and manufacturing of curves, surfaces and solids. Its algorithms are critically important not only in the automobile, ship and aircraft manufacturing business, but are also absolutely necessary in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics and visualization. The calculation and display of geometric objects can be accomplished by six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) is used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 surfaces. Some examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation and degree reduction.

Keywords: Monomial form, rectangular surfaces, CAGD curves, monomial matrix applications.
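
The following sketch shows the coefficient-matrix (monomial form) evaluation of a bicubic patch, S(u, v) = U M P Mᵀ Vᵀ; the cubic Bézier-to-monomial matrix and the control net are stand-ins, since the paper's Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 matrices are not reproduced here:

```python
import numpy as np

# Cubic Bezier-to-monomial conversion matrix: [B0..B3](t) = [1 t t^2 t^3] M
M = np.array([[ 1,  0,  0, 0],
              [-3,  3,  0, 0],
              [ 3, -6,  3, 0],
              [-1,  3, -3, 1]], dtype=float)

# 4x4x3 control net (hypothetical): a gently lifted patch over the unit square
u_grid, v_grid = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4), indexing="ij")
P = np.stack([u_grid, v_grid, np.sin(np.pi * u_grid) * np.sin(np.pi * v_grid)], axis=-1)

def surface_point(u, v):
    U = np.array([1, u, u**2, u**3])
    V = np.array([1, v, v**2, v**3])
    # One matrix product per coordinate; derivatives follow by differentiating U and V.
    return np.array([U @ M @ P[:, :, d] @ M.T @ V for d in range(3)])

print("S(0.5, 0.5) =", np.round(surface_point(0.5, 0.5), 4))
```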

4133 The Location of Park and Ride Facilities Using the Fuzzy Inference Model

Authors: Anna Lower, Michal Lower, Robert Masztalski, Agnieszka Szumilas

Abstract:

The paper presents a method in which expert knowledge is applied to a fuzzy inference model. Even a less experienced person, e.g. an urban planner or official, could benefit from the use of such a system. The analysis result is obtained in a very short time, so a large number of proposed locations can be verified quickly. The proposed method is intended for testing locations of park-and-ride (P&R) car parks in a city. The paper shows selected examples of locations of P&R facilities in cities planning to introduce P&R. Analyses of existing facilities are also presented and confronted with the opinions of the system users, with particular emphasis on unpopular locations. The results of the analyses are compared with an expert analysis of P&R facility locations that was outsourced by the city and with opinions about existing facilities expressed by users on social networking sites. The obtained results are consistent with actual users' feedback. The proposed method proves to be good, yet it does not require the involvement of a large expert team or large financial contributions for complicated research. The method also provides an opportunity to indicate alternative locations of P&R facilities. Although the results of the method are approximate, they are not worse than the results of analyses by employed experts. The advantage of this method is its ease of use, which simplifies professional expert analysis. The ability to analyze a large number of alternative locations gives a broader view of the problem. It is valuable that the arduous analysis by a team of people can be replaced by the model's calculations. According to the authors, the proposed method is also suitable for implementation on a GIS platform.

Keywords: Fuzzy logic inference, P&R facilities, P&R location.
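
A minimal sketch of a Mamdani-style fuzzy inference step of the kind the abstract describes; the inputs, membership functions and rules below are illustrative assumptions, not the expert knowledge base of the paper:

```python
# Rate a candidate P&R location from two hypothetical inputs: distance to a
# transit stop [m] and available parking area [m^2].
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def rate_location(dist_to_stop, parking_area):
    near = tri(dist_to_stop, -1, 0, 400)
    far = tri(dist_to_stop, 200, 800, 1e9)
    small = tri(parking_area, -1, 0, 6000)
    large = tri(parking_area, 3000, 12000, 1e9)

    # Rules: IF near AND large THEN good (1.0); IF far OR small THEN poor (0.0);
    # mixed cases map to a medium score (0.5). Defuzzify by weighted average.
    rules = [(min(near, large), 1.0),
             (max(far, small), 0.0),
             (min(near, small), 0.5),
             (min(far, large), 0.5)]
    num = sum(w * score for w, score in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print("suitability:", round(rate_location(dist_to_stop=250, parking_area=9000), 2))
```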

4132 Investigating the Demand for Short-shelf Life Food Products for SME Wholesalers

Authors: Yamini Raju, Parminder S. Kang, Adam Moroz, Ross Clement, Ashley Hopwell, Alistair Duffy

Abstract:

Accurate forecasting of fresh produce demand is one of the challenges faced by Small and Medium Enterprise (SME) wholesalers. This paper is an attempt to understand the causes of the high variability, such as weather and holidays, in the demand faced by SME wholesalers. Understanding the significance of currently unidentified factors may therefore improve forecasting accuracy. The paper presents the current literature on the factors used to predict demand and the existing forecasting techniques for short shelf-life products. It then investigates a variety of possible internal and external factors, some of which are not used by other researchers in the demand prediction process. The results presented in this paper are further analysed using a number of techniques to minimize noise in the data. For the analysis, past sales data (January 2009 to May 2014) from a UK-based SME wholesaler are used, and the results presented are limited to the product 'Milk', focused on cafés in Derby. A correlation analysis is done to check the dependence of the variability factors on the actual demand. Further, principal component analysis (PCA) is done to understand the significance of the factors identified using correlation. The PCA results suggest that cloud cover, weather summary and temperature are the most significant factors that can be used in forecasting the demand. The correlation of these three factors increases for monthly demand and becomes more stable compared with the weekly and daily demand.

Keywords: Demand Forecasting, Deteriorating Products, Food Wholesalers, Principal Component Analysis and Variability Factors.
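
A short sketch of the correlation-then-PCA analysis described above, on randomly generated factor data rather than the wholesaler's 2009-2014 sales history:

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 200
temperature = rng.normal(12, 6, n)
cloud_cover = rng.uniform(0, 1, n)
holiday = rng.integers(0, 2, n)
# Synthetic demand driven by the factors plus noise, for illustration only
demand = 50 + 1.5 * temperature - 20 * cloud_cover - 10 * holiday + rng.normal(0, 5, n)

df = pd.DataFrame({"temperature": temperature, "cloud_cover": cloud_cover,
                   "holiday": holiday, "demand": demand})

print(df.corr()["demand"].drop("demand"))          # dependence of each factor on demand

X = StandardScaler().fit_transform(df.drop(columns="demand"))
pca = PCA().fit(X)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
print("PC1 loadings:", dict(zip(df.columns[:-1], np.round(pca.components_[0], 3))))
```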

4131 Fabless Prototyping Methodology for the Development of SOI based MEMS Microgripper

Authors: H. M. Usman Sani, Shafaat A. Bazaz, Nisar Ahmed

Abstract:

In this paper, a fabless prototyping methodology is introduced for the design and analysis of MEMS devices. Conventionally, Finite Element Analysis (FEA) is performed before system-level simulation. In our proposed methodology, system-level simulation is performed earlier than FEA, as it is computationally less intensive and lower in cost. System-level simulations are based on equivalent behavioral models of the MEMS device. An electrostatically actuated MEMS microgripper is chosen as a case study to implement this methodology. This paper addresses the behavioral model development and simulation of the actuator part of the electrostatically actuated microgripper. Simulation results show that the actuator part of the microgripper works efficiently over a voltage range of 0-45 V with a corresponding jaw displacement of 0-4.5425 μm. With some minor changes in design, this range can be enhanced to 15 μm at 85 V.

Keywords: MEMS Actuator, Behavioral Model, CoventorWare, Microgripper, SOIMUMPs, System Level Simulation

4130 Effect of Transverse Reinforcement on the Behavior of Tension Lap splice in High-Strength Reinforced Concrete Beams

Authors: Ahmed H. Abdel-Kareem, Hala Abousafa, Omia S. El-Hadidi

Abstract:

The results of an experimental program conducted on seventeen simply supported concrete beams to study the effect of transverse reinforcement on the behavior of lap splices of steel reinforcement in tension zones of high-strength concrete beams are presented. The parameters included in the experimental program were the concrete compressive strength, the lap splice length, the amount of transverse reinforcement provided within the splice region, and the shape of the transverse reinforcement around the spliced bars. The experimental results showed that the displacement ductility increased and the mode of failure changed from splitting bond failure to flexural failure when the amount of transverse reinforcement in the splice region increased and the compressive strength increased up to 100 MPa. The presence of transverse reinforcement around the spliced bars had a pronounced effect on increasing the ultimate load, the ultimate deflection, and the displacement ductility. The prediction of the maximum steel stresses for spliced bars using the ACI 318-05 building code was compared with the experimental results. The comparison showed that the effect of transverse reinforcement around spliced bars has to be incorporated into the design equations for lap splice length in high-strength concrete beams.

Keywords: Ductility, high strength concrete, tension lap splice, transverse reinforcement, steel stresses.

4129 The Analysis of Defects Prediction in Injection Molding

Authors: Mehdi Moayyedian, Kazem Abhary, Romeo Marian

Abstract:

This paper presents an evaluation of a plastic defect in injection molding, known as the short shot defect, before it occurs in the process. The aim of this paper is the evaluation of the different parameters that affect the possibility of short shot occurrence. The analysis of short shot possibility is conducted via SolidWorks Plastics and the Taguchi method to determine the most significant parameters. The Finite Element Method (FEM) is employed to analyze two circular flat polypropylene plates of 1 mm thickness. Filling time, part cooling time, pressure holding time and melt temperature are chosen as process parameters, and gate type as the geometric parameter. A methodology is presented herein to predict the possibility of short-shot occurrence. The analysis determined that melt temperature is the most influential parameter affecting the possibility of a short shot defect, with a contribution of 74.25%, followed by filling time with a contribution of 22% and gate type with a contribution of 3.69%. The optimum levels of each parameter leading to a reduction in the possibility of a short shot were determined to be gate type at level 1, filling time at level 3 and melt temperature at level 3. Overall, the most significant parameters affecting the possibility of a short shot were melt temperature, filling time, and gate type.

Keywords: Injection molding, plastic defects, short shot, Taguchi method.
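
As a sketch of how such percentage contributions are obtained in a Taguchi-style analysis, the following computes factor sums of squares for a hypothetical L9 design; the response values are made up, so the numbers will not match those reported above:

```python
import numpy as np

# Percentage contribution of each factor = sum of squares of its level means
# (weighted by level counts) relative to the total sum of squares.
levels = np.array([[1, 1, 1], [1, 2, 2], [1, 3, 3], [2, 1, 2], [2, 2, 3],
                   [2, 3, 1], [3, 1, 3], [3, 2, 1], [3, 3, 2]])   # 3 factors, 3 levels (L9)
y = np.array([12., 10., 8., 11., 9., 13., 7., 12., 10.])          # hypothetical short-shot scores

grand_mean = y.mean()
ss_total = np.sum((y - grand_mean) ** 2)
factors = ["melt temperature", "filling time", "gate type"]

for j, name in enumerate(factors):
    ss_factor = sum(
        np.sum(levels[:, j] == lvl) * (y[levels[:, j] == lvl].mean() - grand_mean) ** 2
        for lvl in (1, 2, 3))
    print(f"{name:17s} contribution = {100 * ss_factor / ss_total:5.1f}%")
```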

4128 A Background Subtraction Based Moving Object Detection around the Host Vehicle

Authors: Hyojin Lim, Cuong Nguyen Khac, Ho-Youl Jung

Abstract:

In this paper, we propose a moving object detection method that helps the driver safely take his/her car out of a parking lot. When moving objects such as motorbikes, pedestrians, other cars and obstacles are detected at the rear side of the host vehicle, the proposed algorithm can provide a warning to the driver. We assume that the host vehicle is just about to depart. Gaussian Mixture Model (GMM) based background subtraction is applied as the basic method. Pre-processing such as smoothing and post-processing such as morphological filtering are added. We examine which color space gives better performance for the detection of moving objects. Three color spaces, RGB, YCbCr, and Y, are applied and compared in terms of detection rate. Through simulation, we show that the RGB space is more suitable for moving object detection based on background subtraction.

Keywords: Gaussian mixture model, background subtraction, Moving object detection, color space, morphological filtering.
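
A rough sketch of the processing chain described above using OpenCV's MOG2 background subtractor, run in each color space; the video path is a placeholder and the detection-rate evaluation against ground truth is omitted:

```python
import cv2

# GMM (MOG2) background subtraction with smoothing pre-processing and
# morphological post-processing, repeated per color space.
conversions = {"RGB": None, "YCbCr": cv2.COLOR_BGR2YCrCb, "Y": cv2.COLOR_BGR2GRAY}
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

for name, code in conversions.items():
    cap = cv2.VideoCapture("rear_view.avi")          # hypothetical input clip
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    foreground_pixels = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.GaussianBlur(frame, (5, 5), 0)   # smoothing pre-processing
        if code is not None:
            frame = cv2.cvtColor(frame, code)
        mask = subtractor.apply(frame)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove small noise blobs
        foreground_pixels += int((mask > 0).sum())
    cap.release()
    print(f"{name:6s} total foreground pixels: {foreground_pixels}")
```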

4127 A Model to Determine Atmospheric Stability and its Correlation with CO Concentration

Authors: Kh. Ashrafi, Gh. A. Hoshyaripour

Abstract:

Atmospheric stability plays the most important role in the transport and dispersion of air pollutants. Different methods with varying degrees of complexity are used for stability determination. Most of these methods are based on the relative magnitude of convective and mechanical turbulence in atmospheric motions. The Richardson number, the Monin-Obukhov length, the Pasquill-Gifford stability classification and the Pasquill-Turner stability classification are the most common parameters and methods. The Pasquill-Turner Method (PTM), which is employed in this study, makes use of observations of wind speed, insolation and the time of day to classify atmospheric stability with distinguishable indices. In this study, a model is presented for the determination of atmospheric stability conditions using the PTM. As a case study, meteorological data from Mehrabad station in Tehran from 2000 to 2005 are applied to the model. Three different categories are considered to deduce the pattern of stability conditions. First, the overall pattern of stability classification is obtained, and the results show that the atmosphere is in stable, neutral and unstable conditions 38.77%, 27.26% and 33.97% of the time, respectively. It is also observed that days are mostly unstable (66.50%) while nights are mostly stable (72.55%). Second, monthly and seasonal patterns are derived, and the results indicate that the relative frequency of stable conditions decreases from January to June and increases from June to December, while the results for unstable conditions behave in exactly the opposite manner. Autumn is the most stable season with a relative frequency of 50.69% for stable conditions, compared with 42.79%, 34.38% and 27.08% for winter, summer and spring, respectively. The hourly stability pattern is the third category, which points out that unstable conditions are dominant from approximately 03-15 GMT and 04-12 GMT for the warm and cold seasons, respectively. Finally, the correlation between atmospheric stability and CO concentration is obtained.

Keywords: Atmospheric stability, Pasquill-Turner classification, convective turbulence, mechanical turbulence, Tehran.
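
A simplified, rule-based sketch in the spirit of the Pasquill-Turner classification; the thresholds follow a reduced Pasquill-Gifford-style table and are illustrative, not the exact PTM index scheme used in the study:

```python
# Map wind speed, insolation and time of day to a stability class (A-F).
def stability_class(wind_speed, insolation=None, night=False):
    """wind_speed in m/s; insolation one of 'strong', 'moderate', 'slight' (daytime)."""
    if not night:
        table = {"strong":   ["A", "A", "B", "C", "C"],
                 "moderate": ["B", "B", "C", "D", "D"],
                 "slight":   ["B", "C", "C", "D", "D"]}
        row = table[insolation]
    else:
        row = ["F", "E", "D", "D", "D"]            # clear-sky night, simplified
    bins = [2, 3, 5, 6]                            # wind-speed class boundaries [m/s]
    idx = sum(wind_speed >= b for b in bins)
    return row[idx]

# Classes A-C are unstable, D neutral, E-F stable, so frequency statistics such
# as those reported above follow from applying this to each hourly record.
print(stability_class(2.5, insolation="strong"))   # -> 'A' (unstable afternoon)
print(stability_class(1.5, night=True))            # -> 'F' (stable clear night)
```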

4126 Structural Cost of Optimized Reinforced Concrete Isolated Footing

Authors: Mohammed S. Al-Ansari

Abstract:

This paper presents an analytical model to estimate the cost of an optimized design of a reinforced concrete isolated footing based on structural safety. Flexural and optimized formulas for square and rectangular footings are derived based on the ACI building code of design, material cost and optimization. The optimization constraints consist of upper and lower limits on depth and area of steel. Footing depth and area of reinforcing steel are minimized to yield the optimal footing dimensions. The optimized material costs of concrete, reinforcing steel and formwork for the designed sections are computed. A total cost factor, TCF, and other cost factors are developed to generalize and simplify the calculation of footing material cost. Numerical examples are presented to illustrate the model's capability of estimating the material cost of the footing for a desired axial load.

Keywords: Footing, Depth, Concrete, Steel, Formwork, Optimization, Material cost, Cost Factors.
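
A minimal sketch of the material-cost estimate described above; the unit rates, footing dimensions and the normalization chosen here for the total cost factor (TCF) are illustrative assumptions, not the paper's definitions:

```python
# Footing material cost = concrete + reinforcing steel + formwork, collapsed
# into a single total cost factor for quick comparisons.
def footing_material_cost(B, L, d, steel_ratio,
                          c_conc=120.0,      # $/m^3 of concrete (assumed rate)
                          c_steel=1.1,       # $/kg of reinforcing steel (assumed rate)
                          c_form=25.0):      # $/m^2 of side formwork (assumed rate)
    volume = B * L * d                        # concrete volume [m^3]
    steel_mass = steel_ratio * volume * 7850  # steel density ~7850 kg/m^3
    form_area = 2 * (B + L) * d               # perimeter formwork [m^2]
    cost = volume * c_conc + steel_mass * c_steel + form_area * c_form
    tcf = cost / (volume * c_conc)            # cost normalized by the concrete cost
    return cost, tcf

cost, tcf = footing_material_cost(B=2.4, L=2.4, d=0.55, steel_ratio=0.004)
print(f"material cost ~ ${cost:,.0f}, total cost factor TCF ~ {tcf:.2f}")
```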

4125 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups. It then tries to find the most important features among the four groups that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER obtained through the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores of each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all published systems to date when evaluated against exactly the same SEER data (period of 1973-2002). It gives an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.

Keywords: Classifier ensemble, breast cancer survivability, data mining, SEER.
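
A hedged sketch of the two-level ensemble described above using scikit-learn's stacking; the sklearn breast-cancer dataset replaces SEER, and logistic regression stands in for the Bayesian network learner, which has no direct scikit-learn counterpart:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base learners (the paper uses a decision tree, a Bayesian network and Naive
# Bayes); a Naive Bayes meta-classifier combines their class probabilities.
ensemble = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                ("bayes_net_standin", LogisticRegression(max_iter=5000)),
                ("naive_bayes", GaussianNB())],
    final_estimator=GaussianNB(),
    stack_method="predict_proba")
ensemble.fit(X_tr, y_tr)

pred = ensemble.predict(X_te)
print("weighted average F-score:", round(f1_score(y_te, pred, average="weighted"), 4))
```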

4124 The Benefits of End-To-End Integrated Planning from the Mine to Client Supply for Minimizing Penalties

Authors: G. Martino, F. Silva, E. Marchal

Abstract:

Control over the characteristics of the delivered iron ore blend is one of the most important aspects of the mining business. The iron ore price is a function of its composition, which is the outcome of the beneficiation process. So, end-to-end integrated planning of mine operations can reduce the risk of penalties on the iron ore price. In a standard iron mining company, the production chain is composed of mining, ore beneficiation, and client supply. When mine planning and client supply decisions are made in an uncoordinated way, the beneficiation plant struggles to deliver the best possible blend. Technological improvements in several fields have allowed bridging the gap between departments and boosting integrated decision-making processes. Clusterization and classification algorithms applied to historical production data generate reasonable predictions of the quality and volume of iron ore produced for each pile of run-of-mine (ROM) processed. Mathematical modeling can use those deterministic relations to propose iron ore blends that better fit specifications within a delivery schedule. Additionally, a model capable of representing the whole production chain can clearly compare the overall impact of different decisions in the process. This study shows how flexibilization combined with a planning optimization model between the mine and the ore beneficiation processes can reduce the risk of out-of-specification deliveries. The model's capabilities are illustrated on a hypothetical iron ore mine with a magnetic separation process. Finally, this study shows ways of reducing costs or increasing profit by optimizing process indicators across the production chain and integrating the different plannings with the sales decisions.

Keywords: Clusterization and classification algorithms, integrated planning, optimization, mathematical modeling, penalty minimization.
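
A small sketch of the blend-proposal step described above, posed as a linear program; the pile grades, costs and specification limits are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

# Choose tonnages from available ROM piles so the blend meets client
# specifications at minimum handling cost.
fe = np.array([62.0, 58.5, 65.0, 60.0])      # % Fe per pile
si = np.array([4.5, 6.0, 3.0, 5.5])          # % SiO2 per pile
cost = np.array([10.0, 7.0, 14.0, 8.0])      # $/t handling cost per pile
demand = 100.0                               # kt to deliver

# Constraints: total tonnage = demand, blended Fe >= 61%, blended SiO2 <= 5%
A_ub = np.vstack([-(fe - 61.0), (si - 5.0)])
b_ub = np.zeros(2)
A_eq = np.ones((1, 4)); b_eq = [demand]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 60)] * 4)          # at most 60 kt from any single pile
print("tonnages per pile [kt]:", np.round(res.x, 1), "| cost:", round(res.fun, 1))
```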

4123 Numerical Simulations of Acoustic Imaging in Hydrodynamic Tunnel with Model Adaptation and Boundary Layer Noise Reduction

Authors: Sylvain Amailland, Jean-Hugh Thomas, Charles Pézerat, Romuald Boucheron, Jean-Claude Pascal

Abstract:

The noise requirements for naval and research vessels have led to an increasing demand for quieter ships in order to fulfil current regulations and to reduce the effects on marine life. Hence, new methods dedicated to the characterization of propeller noise, which is the main source of noise in the far field, are needed. The study of cavitating propellers in a closed section is interesting for analyzing hydrodynamic performance but can involve significant difficulties for hydroacoustic study, especially due to reverberation and boundary layer noise in the tunnel. The aim of this paper is to present a numerical methodology for the identification of hydroacoustic sources on marine propellers using hydrophone arrays in a large hydrodynamic tunnel. The main difficulties are linked to the reverberation of the tunnel and the boundary layer noise, which strongly reduce the signal-to-noise ratio. In this paper, it is proposed to estimate the reflection coefficients using an inverse method and some reference transfer functions measured in the tunnel. This approach reduces the uncertainties of the propagation model used in the inverse problem. In order to reduce the boundary layer noise, a cleaning algorithm taking advantage of the low-rank and sparse structure of the cross-spectrum matrices of the acoustic and boundary layer noise is presented. This approach allows the acoustic signal to be recovered even when it lies well below the boundary layer noise. The improvement brought by this method is visible on acoustic maps resulting from beamforming and DAMAS algorithms.

Keywords: Acoustic imaging, boundary layer noise denoising, inverse problems, model adaptation.

4122 Information Retrieval: Improving Question Answering Systems by Query Reformulation and Answer Validation

Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour

Abstract:

Question answering (QA) aims at retrieving precise information from a large collection of documents. Most question answering systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems by reformulating questions. Moreover, the answer processing module is an emerging topic in QA systems, where these systems are often required to rank and validate candidate answers. These techniques, aiming at finding short and precise answers, are often based on semantic relations and co-occurring keywords. This paper discusses a new model for question answering which improves the two main modules, question processing and answer processing, both of which affect the evaluation of the system's operation. There are two important components that form the basis of question processing. The first component is question classification, which specifies the types of question and answer. The second is reformulation, which converts the user's question into a question understandable by the QA system in a specific domain. The objective of an answer validation task is to judge the correctness of an answer returned by a QA system, according to the text snippet given to support it. For validating answers, we apply candidate answer filtering and candidate answer ranking, with a final validation step based on user voting. This paper also describes the new architecture of the question and answer processing modules, together with the modeling, implementation and evaluation of the system. The system differs from most question answering systems in its answer validation model, which makes it more suitable for finding exact answers. Evaluation of the model over a total of 50 asked questions shows a 92% improvement in the system's decisions.

Keywords: Answer processing, answer validation, classification, question answering, query reformulation.

4121 Optimum Shape and Design of Cooling Towers

Authors: A. M. El Ansary, A. A. El Damatty, A. O. Nassef

Abstract:

The aim of the current study is to develop a numerical tool that is capable of achieving an optimum shape and design of hyperbolic cooling towers based on coupling a non-linear finite element model developed in-house and a genetic algorithm optimization technique. The objective function is set to be the minimum weight of the tower. The geometric modeling of the tower is represented by means of B-spline curves. The finite element method is applied to model the elastic buckling behaviour of a tower subjected to wind pressure and dead load. The study is divided into two main parts. The first part investigates the optimum shape of the tower corresponding to minimum weight assuming constant thickness. The study is extended in the second part by introducing the shell thickness as one of the design variables in order to achieve an optimum shape and design. Design, functionality and practicality constraints are applied.

Keywords: B-splines, Cooling towers, Finite element, Genetic algorithm, Optimization

4120 Institutional Efficiency of Commonhold Industrial Parks Using a Polynomial Regression Model

Authors: Jeng-Wen Lin, Simon Chien-Yuan Chen

Abstract:

Based on assumptions of neo-classical economics and rational choice / public choice theory, this paper investigates the regulation of industrial land use in Taiwan by homeowners associations (HOAs) as opposed to traditional government administration. The comparison, which applies transaction cost theory and a polynomial regression analysis, showed that HOAs are superior to conventional government administration in terms of transaction costs and overall efficiency. A case study comparing Taiwan's commonhold industrial park, NangKang Software Park, to traditional government counterparts using limited data on costs and returns was analyzed. This empirical study of the relative efficiency of governmental and private institutions justifies the important theoretical proposition. Numerical results prove the efficiency of the established model.

Keywords: Homeowners Associations, Institutional Efficiency, Polynomial Regression, Transaction Cost.

4119 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a Decision Support System (DSS) to support operational decisions in Emergency Medical Service (EMS) systems regarding the assignment of medical emergency requests to Emergency Departments (EDs). This problem is called “hospital selection” and concerns the definition of policies for selecting the ED to which patients who require further treatment are transported by ambulance. The research methodology consists of a first phase reviewing the technical-scientific literature on DSSs that support EMS management and, in particular, the hospital selection decision. From the literature analysis, it emerged that current studies mainly focus on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a mission. Therefore, all ED-related issues are excluded and considered as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes travelling time and frees up the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that also considers the expected time performance in the subsequent phases of the process, such as the case mix, the expected service throughput times, and the operational capacity of the EDs in different hospitals. To this end, a Discrete Event Simulation (DES) model was created to compare different hospital selection policies. The model was implemented with the AnyLogic software and validated on a realistic case. The hospital selection policy that returned the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the start of the clinical evaluation by the doctor in the ED. Finally, two approaches were compared: a static approach, based on a retrospective estimation of the TTP, and a dynamic approach, based on a predictive estimation of the TTP determined with a constantly updated Winters forecasting model. Findings reveal that minimizing the TTP is the best hospital selection policy. It significantly reduces service throughput times in the ED with a negligible increase in travel time. Furthermore, it provides an immediate view of the saturation state of the EDs and takes into account the case mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Moreover, the predictive approach is more reliable in terms of TTP estimation than the retrospective approach. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.

Keywords: Emergency medical services, hospital selection, discrete event simulation, forecast model.
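
A sketch of the forecasting ingredient of the dynamic policy described above, a Winters (triple exponential smoothing) model fitted to a synthetic hourly TTP series; the series, its daily seasonality and the horizon are illustrative assumptions:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
hours = pd.date_range("2023-01-01", periods=24 * 21, freq="h")
ttp = 35 + 10 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 3, len(hours))
series = pd.Series(ttp, index=hours)

# Winters model with additive trend and a 24-hour additive seasonal cycle,
# refit as new observations arrive in the dynamic policy.
model = ExponentialSmoothing(series, trend="add", seasonal="add",
                             seasonal_periods=24).fit()
forecast = model.forecast(6)                   # expected TTP for the next 6 hours
print(forecast.round(1))
# A dynamic hospital-selection rule would add the forecast ED waiting component
# of each candidate hospital to its travel time and dispatch to the minimum.
```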

4118 Free Vibration Analysis of Functionally Graded Pretwisted Plate in Thermal Environment Using Finite Element Method

Authors: S. Parida, S. C. Mohanty

Abstract:

The free vibration behavior of a thick pretwisted cantilevered functionally graded material (FGM) plate subjected to a thermal environment is investigated numerically in the present paper. A mathematical model is developed in the framework of higher order shear deformation theory (HOST) with a C0 finite element formulation, i.e., independent displacements and rotations. The material properties are assumed to be temperature dependent and to vary continuously through the thickness according to the volume fraction exponent in a simple power law. The finite element model is discretized into eight-node quadratic serendipity elements with seven degrees of freedom per node. The effects of plate geometry, temperature field and material composition on the vibrational characteristics are examined through modal analysis. Finally, the results are verified by comparison with those available in the literature.

Keywords: FGM, pretwisted plate, thermal environment, HOST, simple power law.
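
A short sketch of the through-thickness material law described above, blending ceramic and metal moduli with a simple power law and a linearized temperature dependence; the property values and temperature coefficients are illustrative placeholders, not the constants used in the paper:

```python
import numpy as np

def fgm_modulus(z, h, n, T, E_c0=380e9, E_m0=70e9, beta_c=-3e-4, beta_m=-5e-4):
    """Effective Young's modulus at thickness coordinate z in [-h/2, h/2]."""
    Vc = ((z / h) + 0.5) ** n                 # ceramic volume fraction (simple power law)
    E_c = E_c0 * (1.0 + beta_c * (T - 300.0)) # linearized temperature dependence
    E_m = E_m0 * (1.0 + beta_m * (T - 300.0))
    return (E_c - E_m) * Vc + E_m             # rule-of-mixtures blend

h, n, T = 0.02, 2.0, 450.0
for z in np.linspace(-h / 2, h / 2, 5):
    print(f"z = {z:+.4f} m -> E = {fgm_modulus(z, h, n, T) / 1e9:6.1f} GPa")
```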

4117 Theoretical Considerations for Software Component Metrics

Authors: V. Lakshmi Narasimhan, Bayu Hendradjaya

Abstract:

We have defined two suites of metrics, which cover static and dynamic aspects of component assembly. The static metrics measure the complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. Dynamic metrics are collected during the runtime of a complete application. Dynamic metrics are useful for identifying super-components and for evaluating the degree of utilisation of the various components. In this paper both static and dynamic metrics are evaluated using Weyuker's set of properties. The result shows that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite to McCall's Quality Model and illustrate their impact on product quality and on the management of component-based product development.

Keywords: Component Assembly, Component Based Software Engineering, CORBA Component Model, Software Component Metrics.

4116 Real-Time Land Use and Land Information System in Homagama Divisional Secretariat Division

Authors: Kumara Jayapathma J. H. M. S. S., Dampegama S. D. P. J.

Abstract:

Land is a valuable and limited resource that constantly changes with the growth of the population. An efficient and sound land management system is essential to avoid conflicts associated with land. This paper aims to design a prototype model of a mobile GIS land use and land information system operating in real time. The Homagama Divisional Secretariat Division, situated in the Western Province of Sri Lanka, was selected as the study area. The prototype model was developed after reviewing the related literature. The methodology consisted of designing and modeling the prototype as an application running on a mobile platform. The system architecture mainly consists of a Google mapping app for real-time updates with Firebase support tools, and the implementation therefore consists of front-end and back-end components. The software tools used in designing the application are Android Studio with Java, based on a GeoJSON file structure. Android Studio with Java, synchronizing GeoJSON files to Firebase, was found to be a suitable mobile solution for continuously updating the land use and land information system (LIS) in real time in the present scenario. The mobile-based land use and LIS developed in this study is a multi-user application catering to different hierarchy levels such as basic users, supervisory managers, and database administrators. The benefits of this mobile mapping application will help public sector field officers without GIS expertise to overcome land use planning challenges with land use updated in real time.

Keywords: Android, Firebase, GeoJSON, GIS, JAVA, JSON, LIS, mobile GIS, real-time, REST API.
