Search results for: Multinomial dirichlet classification model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8300


4280 Estimation Model for Concrete Slump Recovery by Using Superplasticizer

Authors: Chaiyakrit Raoupatham, Ram Hari Dhakal, Chalermchai Wanichlamlert

Abstract:

This paper introduces a practical solution for concrete slump recovery using a type-F chemical admixture (superplasticizer, naphthalene based) in order to solve the problem of concrete becoming unusable after losing its slump, which is especially acute in tropical countries with faster slump loss rates. On the other hand, adding superplasticizer to concrete arbitrarily can cause segregation. Therefore, this paper also develops an estimation model for calculating the second dose of superplasticizer needed to recover the slump. Fresh properties of ordinary Portland cement concrete with a volumetric ratio of paste to void between aggregates (paste content) of 1.1-1.3, a water-cement ratio of 0.30 to 0.67, and an initial superplasticizer (naphthalene base) dose of 0.25%-1.6% were tested for initial slump and for slump loss every 30 minutes over one and a half hours by the slump cone test. Concretes with slump loss ranging from 10% to 90% were re-dosed and successfully recovered to their initial slump, which was again verified by the slump cone test. From the results, it is concluded that slump loss was slower for mixes with a high initial dose of superplasticizer, because the added superplasticizer interferes with cement hydration. The required second dose of superplasticizer was governed by two major parameters, water-cement ratio and paste content: lower values of either increase the required second dose. The second dose also increases with the solid content of the system, whether the solids come from cement particles or from aggregate. The data were analyzed to form an equation for estimating the second dose of superplasticizer required to recover the slump to its original value.
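A minimal sketch of how such an estimation equation can be fitted by least squares is shown below; the water-cement ratios, paste contents, second doses and the linear functional form are illustrative assumptions, not the data or equation of the paper.

```python
import numpy as np

# Hypothetical observations: water-cement ratio, paste content, and the
# second superplasticizer dose (% of cement) that recovered the slump.
wc    = np.array([0.30, 0.40, 0.50, 0.60, 0.67])
paste = np.array([1.10, 1.15, 1.20, 1.25, 1.30])
dose2 = np.array([1.20, 0.95, 0.70, 0.55, 0.40])   # illustrative values only

# Simple linear estimation model: dose2 = b0 + b1*wc + b2*paste
X = np.column_stack([np.ones_like(wc), wc, paste])
coeffs, *_ = np.linalg.lstsq(X, dose2, rcond=None)

def predict_second_dose(wc_ratio, paste_content):
    """Estimate the second superplasticizer dose for a given mix."""
    return coeffs @ np.array([1.0, wc_ratio, paste_content])

print(predict_second_dose(0.45, 1.2))
```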

Keywords: Estimation model, second superplasticizer dosage, slump loss, slump recovery.

4279 Processes Simulation Study of Coal to Methanol Based on Gasification Technology

Authors: Po-Chuang Chen, Hsiu-Mei Chiu, Yau-Pin Chyou, Chiou-Shia Yu

Abstract:

This study presents a simulation model for converting coal to methanol, based on gasification technology, built with the commercial chemical process simulator Pro/II® V8.1.1. The methanol plant consists of an air separation unit (ASU), a gasification unit, a gas clean-up unit, and a methanol synthesis unit. Clean syngas is produced by the first three operating units, and the model has been verified against reference data from the United States Environmental Protection Agency. The liquid phase methanol (LPMEOHTM) process is adopted in the methanol synthesis unit. Clean syngas passes through the gas handling section to reach the reaction conditions, the reactor loop/catalyst to generate methanol, and methanol distillation to reach the desired purity of over 99.9 wt%. The ratio of the total energy of the produced methanol and dimethyl ether to that of the feed coal is 78.5% (gross efficiency). The net efficiency is 64.2% when internal power consumption is taken into account, based on the assumption that the efficiency of electricity generation is 40%.

Keywords: Gasification, Methanol, LPMEOH, System-level simulation.

4278 Use of Remote Sensing Data for Spatiotemporal Analysis of Land Use Changes in the Eastern Aurès (Algeria)

Authors: A. Bouzekri, H. Benmassaud

Abstract:

The Aurès region is one of the arid and semi-arid areas that have suffered climate crises and overexploitation of natural resources, which have led to significant land degradation. The use of remote sensing data allowed us to analyze the land and its spatiotemporal changes in the Aurès between 1987 and 2013. For this work, we adopted a method of analysis based on Landsat TM 1987 and Landsat OLI 2013 satellite images, processed by supervised maximum likelihood classification coupled with field surveys carried out in May and September 2013. By superposing the 1987 and 2013 land cover maps in ENVI EX software, a spatial change map of the different land cover units was extracted. The results show that between 1987 and 2013 vegetation underwent negative changes, namely significant degradation of forests and steppe rangelands, while sandy soils and bare land recorded a considerable increase. The spatial change map of land cover units between 1987 and 2013 allows us to understand the expansive or regressive trend of vegetation and soils: dense forests give way to clear forests, steppe vegetation develops from degraded and bare forest vegetation, and sandy soils gain large steppe surfaces, which explains their remarkable extension. The analysis of remote sensing data highlights the profound changes in our environment over time and supports quantitative monitoring of the risk of desertification.
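The post-classification change analysis described above can be sketched as a cross-tabulation of two classified rasters; the class codes and rasters below are hypothetical stand-ins for the 1987 and 2013 Landsat classifications.

```python
import numpy as np

# Hypothetical classified rasters for 1987 and 2013 (same grid, integer class codes:
# 0 = dense forest, 1 = clear forest, 2 = steppe, 3 = bare/sandy soil).
classes_1987 = np.random.randint(0, 4, size=(100, 100))
classes_2013 = np.random.randint(0, 4, size=(100, 100))

n_classes = 4
# Change (transition) matrix: rows = 1987 class, columns = 2013 class, pixel counts.
change_matrix = np.zeros((n_classes, n_classes), dtype=int)
np.add.at(change_matrix, (classes_1987.ravel(), classes_2013.ravel()), 1)

# Pixels whose class changed between the two dates form the spatial change map.
change_map = classes_1987 != classes_2013
print(change_matrix)
print("changed area fraction:", change_map.mean())
```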

Keywords: Aurès, Land use, remote sensing, spatiotemporal.

4277 A Semi-Implicit Phase Field Model for Droplet Evolution

Authors: M. H. Kazemi, D. Salac

Abstract:

A semi-implicit phase field method for droplet evolution is proposed. Using the phase field Cahn-Hilliard equation, we are able to track the interface in multiphase flow. The idea of a semi-implicit finite difference scheme is reviewed and employed to solve two nonlinear equations, including the Navier-Stokes and the Cahn-Hilliard equations. The use of a semi-implicit method allows us to have larger time steps compared to explicit schemes. The governing equations are coupled and then solved by a GMRES solver (generalized minimal residual method) using modified Gram-Schmidt orthogonalization. To show the validity of the method, we apply the method to the simulation of a rising droplet, a leaky dielectric drop and the coalescence of drops. The numerical solutions to the phase field model match well with existing solutions over a defined range of variables.
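The linear-algebra core of such a semi-implicit step can be illustrated as follows: the stiff linear term is treated implicitly, yielding a sparse system solved with GMRES. The operator below is a generic 1D periodic Laplacian with a cubic nonlinearity standing in for the coupled Navier-Stokes/Cahn-Hilliard system of the paper.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres

# 1D periodic Laplacian as a stand-in for the stiff linear operator of the scheme.
n, dt, kappa = 128, 1e-3, 1e-2
dx = 1.0 / n
lap = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n), format="lil")
lap[0, n - 1] = lap[n - 1, 0] = 1.0           # periodic boundary conditions
lap = lap.tocsr() / dx**2

phi = np.tanh(np.sin(2 * np.pi * np.arange(n) * dx) / 0.1)   # initial phase field

# Semi-implicit step: (I - dt*kappa*Lap) phi_new = phi_old + dt * f_explicit(phi_old)
A = sp.identity(n, format="csr") - dt * kappa * lap
rhs = phi + dt * (phi - phi**3)               # explicit nonlinear part (illustrative)
phi_new, info = gmres(A, rhs)                 # Krylov solve (GMRES)
assert info == 0, "GMRES did not converge"
```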

Keywords: Coalescence, leaky dielectric, numerical method, phase field, rising droplet, semi-implicit method.

4276 The Impact of Digital Inclusive Finance on the High-Quality Development of China's Export Trade

Authors: Yao Wu

Abstract:

In the context of financial globalization, China has put forward the policy goal of high-quality development, and the digital economy, with its advantage in information resources, is driving China's export trade toward high-quality development. Because small and medium-sized export enterprises face long-standing financing constraints, expanding their export scale has become a major bottleneck for the development of China's export trade. This paper first adopts the analytic hierarchy process to establish an evaluation system for the high-quality development of China's export trade; second, panel data of 30 Chinese provinces from 2011 to 2018 are used in an empirical analysis to establish a model of the impact of digital inclusive finance on the high-quality development of China's export trade; finally, based on the analysis of the heterogeneous-firm trade model, a mediating effect model is established to verify the mediating role of credit constraints in the high-quality development of China's export trade. On this basis, the paper concludes that digital inclusive finance, with its unique digital and inclusive nature, alleviates the credit constraint problem among SMEs, enhances the binary (extensive and intensive) margins of SMEs' exports, optimizes their export scale and structure, and promotes the high-quality development of regional and even national export trade. Finally, based on these findings, we propose insights and suggestions for digital inclusive finance to promote the high-quality development of export trade.
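A minimal sketch of the mediating-effect logic (Baron-Kenny style, with statsmodels) on synthetic data is given below; the variable names and data-generating process are illustrative assumptions, not the paper's panel data or estimator.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical province-year panel: digital inclusive finance index, credit constraint
# proxy, and an export-quality score built from the evaluation system.
rng = np.random.default_rng(0)
n = 240  # e.g. 30 provinces x 8 years
df = pd.DataFrame({"dif": rng.normal(size=n)})
df["credit_constraint"] = -0.5 * df["dif"] + rng.normal(scale=0.5, size=n)
df["export_quality"] = (0.3 * df["dif"] - 0.4 * df["credit_constraint"]
                        + rng.normal(scale=0.5, size=n))

# Step 1: total effect of digital inclusive finance on export quality.
total = sm.OLS(df["export_quality"], sm.add_constant(df[["dif"]])).fit()
# Step 2: effect of digital inclusive finance on the mediator (credit constraint).
med = sm.OLS(df["credit_constraint"], sm.add_constant(df[["dif"]])).fit()
# Step 3: joint model; a shrunken 'dif' coefficient indicates partial mediation.
joint = sm.OLS(df["export_quality"],
               sm.add_constant(df[["dif", "credit_constraint"]])).fit()
print(total.params["dif"], joint.params["dif"],
      med.params["dif"] * joint.params["credit_constraint"])   # indirect effect
```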

Keywords: Digital inclusive finance, high-quality development of export trade, fixed effects, binary marginal effects.

4275 Numerical Analysis of Laminar to Turbulent Transition on the DU91-W2-250 Airfoil

Authors: M. Raciti Castelli, G. Grandi, E. Benini

Abstract:

This paper presents a study of laminar-to-turbulent transition on a profile specifically designed for wind turbine blades, the DU91-W2-250, which belongs to a class of wind-turbine-dedicated airfoils developed by Delft University of Technology. A comparison between the experimental behavior of the airfoil measured in the Delft wind tunnel and the numerical predictions of the commercial CFD solver ANSYS FLUENT® has been performed. The prediction capabilities of the Spalart-Allmaras turbulence model and of the γ-θ transition model have been tested. A sensitivity analysis of the numerical results to the spatial domain discretization has also been performed using four different computational grids, created with the mesher GAMBIT®. The comparison between experimental measurements and CFD results has made it possible to assess the importance of correctly predicting the laminar-to-turbulent transition, in order not to overestimate airfoil friction drag through a fully turbulent flow computation.

Keywords: CFD, wind turbine, DU91-W2-250, laminar to turbulent transition.

4274 Numerical Prediction of Bearing Strength on Composite Bolted Joint Using Three Dimensional Puck Failure Criteria

Authors: M. S. Meon, M. N. Rao, K-U. Schröder

Abstract:

Mechanical fasteners, especially bolts, are commonly used for joining carbon-fiber reinforced polymer (CFRP) composite structures because of their good joinability and ease of maintenance. Since this approach involves notching, a proper progressive damage model (PDM) needs to be implemented and verified to capture the damage that develops in the structure. A three-dimensional (3D) Puck failure criterion is established to predict the ultimate bearing failure of such a joint. The failure criterion, combined with a degradation scheme, is coded in a user subroutine executed in Abaqus. A single lap joint (SLJ) of a composite bolted joint is used as the target configuration. The results revealed that the PDM adopted here can adequately predict the behaviour of the composite bolted joint up to ultimate bearing failure. In addition, mesh refinement near the holes increased the accuracy of the predicted strength as well as the computational effort.

Keywords: Bearing strength, bolted joint, degradation scheme, progressive damage model.

4273 Modeling of Compaction Curves for Corn Cob Ash-Cement Stabilized Lateritic Soils

Authors: O. A. Apampa, Y. A. Jimoh, K. A. Olonade

Abstract:

The need to save time and cost of soil testing at the planning stage of road works has necessitated the development of predictive models. This study proposes a model for predicting the dry density of lateritic soils stabilized with corn cob ash (CCA) and with blended cement-CCA. Lateritic soil was first stabilized with CCA at 1.5, 3.0, 4.5 and 6% of the weight of soil and then stabilized with the same proportions as replacement for cement. Dry density, specific gravity, maximum degree of saturation and moisture content were determined for each stabilized soil specimen, following standard procedures. Polynomial equations containing alpha and beta parameters for CCA and blended CCA-cement were developed. Experimental values were correlated with the values predicted using the Matlab curve fitting tool and the Solver function of Microsoft Excel 2010. A coefficient of determination (R²) of 0.86 was obtained, indicating that the model can be accepted for predicting the maximum dry density of CCA-stabilized soils and thereby facilitate quick decision-making in road works.
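A compact illustration of the fitting step, in the spirit of the Matlab/Excel Solver fits used in the study, is shown below with hypothetical CCA contents and dry densities.

```python
import numpy as np

# Hypothetical data: CCA content (% by weight of soil) vs maximum dry density (Mg/m^3).
cca = np.array([0.0, 1.5, 3.0, 4.5, 6.0])
mdd = np.array([1.82, 1.85, 1.87, 1.86, 1.83])   # illustrative values only

# Second-order polynomial model with alpha/beta-style parameters.
coeffs = np.polyfit(cca, mdd, deg=2)
mdd_hat = np.polyval(coeffs, cca)

# Coefficient of determination R^2 between measured and predicted densities.
ss_res = np.sum((mdd - mdd_hat) ** 2)
ss_tot = np.sum((mdd - mdd.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(coeffs, r2)
```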

Keywords: Corn cob ash, lateritic soil, stabilization, maximum dry density, moisture content.

4272 Integrating Computational Intelligence Techniques and Assessment Agents in E-Learning Environments

Authors: Konstantinos C. Giotopoulos, Christos E. Alexakos, Grigorios N. Beligiannis, Spiridon D. Likothanassis

Abstract:

In this contribution, an innovative platform is presented that integrates intelligent agents and evolutionary computation techniques into legacy e-learning environments. It introduces the design and development of a scalable and interoperable integration platform supporting: i) various assessment agents for e-learning environments, ii) a resource retrieval agent for the provision of additional information from Internet sources matching the needs and profile of the specific user, and iii) a genetic algorithm designed to extract efficient information (classifying rules) from the students' answering input data. The agents are implemented to provide intelligent assessment services based on computational intelligence techniques such as Bayesian networks and genetic algorithms. The idea of using a genetic algorithm (GA) for this difficult task comes from the fact that GAs have been widely used in applications that include the classification of unknown data. The utilization of new and emerging technologies such as web services allows the provided services to be integrated into any web-based legacy e-learning environment.

Keywords: Bayesian Networks, Computational Intelligence techniques, E-learning legacy systems, Service Oriented Integration, Intelligent Agents, Genetic Algorithms.

4271 Monomial Form Approach to Rectangular Surface Modeling

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Geometric modeling plays an important role in the construction and manufacturing of curves, surfaces and solids. Its algorithms are critically important not only in the automobile, ship and aircraft manufacturing business, but also in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics and visualization. The calculation and display of geometric objects can be accomplished by six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) is used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 surfaces. Examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation and degree reduction.
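The coefficient-matrix (monomial form) evaluation of a tensor-product patch can be sketched as follows; the bicubic coefficients are random placeholders for those obtained by converting a Said-Ball, Wang-Ball, DP, Dejdumrong or NB1 control net into monomial form.

```python
import numpy as np

def eval_monomial_patch(C, u, v):
    """Evaluate a tensor-product polynomial patch in monomial (coefficient-matrix) form.

    C has shape (m+1, n+1, 3): C[i, j] are the xyz coefficients of u**i * v**j,
    i.e. S(u, v) = sum_ij C[i, j] * u**i * v**j.
    """
    m, n = C.shape[0] - 1, C.shape[1] - 1
    U = u ** np.arange(m + 1)          # monomial basis in u
    V = v ** np.arange(n + 1)          # monomial basis in v
    return np.einsum("i,ijk,j->k", U, C, V)

# Hypothetical bicubic patch: in practice the coefficients would come from converting
# a Said-Ball / Wang-Ball / DP control net into the monomial form.
rng = np.random.default_rng(1)
C = rng.normal(size=(4, 4, 3))
print(eval_monomial_patch(C, 0.3, 0.7))
```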

Keywords: Monomial form, rectangular surfaces, CAGD curves, monomial matrix applications.

4270 The Location of Park and Ride Facilities Using the Fuzzy Inference Model

Authors: Anna Lower, Michal Lower, Robert Masztalski, Agnieszka Szumilas

Abstract:

The paper presents a method in which expert knowledge is applied to a fuzzy inference model. Even a less experienced person, e.g., an urban planner or an official, can benefit from the use of such a system. The analysis result is obtained in a very short time, so a large number of proposed locations can be verified quickly. The proposed method is intended for testing locations of park-and-ride (P&R) car parks in a city. The paper shows selected examples of locations of P&R facilities in cities planning to introduce P&R, as well as analyses of existing facilities, which are confronted with the opinions of the system users, with particular emphasis on unpopular locations. The results of the analyses are compared with an expert analysis of P&R facility locations that was commissioned by the city and with opinions about existing facilities expressed by users on social networking sites. The obtained results are consistent with actual user feedback. The proposed method proves to be effective, yet it does not require the involvement of a large team of experts or large financial outlays for complicated research. The method also makes it possible to indicate alternative locations of P&R facilities. Although the results of the method are approximate, they are not worse than the results of analyses by hired experts. The advantage of this method is its ease of use, which simplifies professional expert analysis. The ability to analyze a large number of alternative locations gives a broader view of the problem. It is valuable that the arduous analysis by a team of people can be replaced by the model's calculation. According to the authors, the proposed method is also suitable for implementation on a GIS platform.
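A minimal zero-order Sugeno-style sketch of such a fuzzy scoring of candidate P&R sites is given below; the inputs, membership functions and rules are illustrative assumptions, not the expert rule base of the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def pr_location_score(dist_center_km, transit_freq_per_h):
    """Zero-order Sugeno-style suitability score (0..1) for a candidate P&R site."""
    far  = tri(dist_center_km, 4, 10, 20)      # site lies on the city outskirts
    near = tri(dist_center_km, 0, 2, 6)        # site too close to the centre
    good_transit = tri(transit_freq_per_h, 2, 8, 15)
    # Illustrative rules: far AND good transit -> high suitability; near -> low.
    rules = [(min(far, good_transit), 1.0), (near, 0.2)]
    w = sum(r[0] for r in rules)
    return sum(r[0] * r[1] for r in rules) / w if w > 0 else 0.0

print(pr_location_score(8.0, 10.0))   # an outskirts site with frequent transit
```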

Keywords: Fuzzy logic inference, P&R facilities, P&R location.

4269 Feature Selection and Predictive Modeling of Housing Data Using Random Forest

Authors: Bharatendra Rai

Abstract:

Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. Too many features are known not only to slow algorithms down, but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios, and their impact on model accuracy is captured using the coefficient of determination (R²) and the root mean square error (RMSE).
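A condensed sketch of the workflow is shown below, assuming scikit-learn and the third-party boruta package (BorutaPy); a synthetic frame stands in for the 79-feature housing data, and only one partitioning ratio is shown.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error
from boruta import BorutaPy   # third-party package assumed to be installed

# Synthetic stand-in for the 79-feature housing data.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(500, 20)), columns=[f"f{i}" for i in range(20)])
y = 3 * X["f0"] - 2 * X["f1"] + rng.normal(scale=0.5, size=500)

# Wrapper feature selection with Boruta built around a random forest.
rf = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
boruta = BorutaPy(rf, n_estimators="auto", random_state=0)
boruta.fit(X.values, y.values)
confirmed = X.columns[boruta.support_]

# Predictive random forest on the confirmed features, one partitioning ratio shown.
X_tr, X_te, y_tr, y_te = train_test_split(X[confirmed], y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R2:", r2_score(y_te, pred), "RMSE:", mean_squared_error(y_te, pred) ** 0.5)
```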

Keywords: Housing data, feature selection, random forest, Boruta algorithm, root mean square error.

4268 Fabless Prototyping Methodology for the Development of SOI based MEMS Microgripper

Authors: H. M. Usman Sani, Shafaat A. Bazaz, Nisar Ahmed

Abstract:

In this paper, a fabless prototyping methodology is introduced for the design and analysis of MEMS devices. Conventionally, finite element analysis (FEA) is performed before system-level simulation. In our proposed methodology, system-level simulation is performed before FEA, as it is computationally less intensive and lower in cost. System-level simulations are based on equivalent behavioral models of the MEMS device. An electrostatically actuated MEMS microgripper is chosen as a case study to implement this methodology. This paper addresses the behavioral model development and simulation of the actuator part of the electrostatically actuated microgripper. Simulation results show that the actuator part of the microgripper works efficiently for a voltage range of 0-45 V with a corresponding jaw displacement of 0-4.5425 μm. With some minor changes in the design, this range can be extended to 15 μm at 85 V.
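A lumped-parameter behavioral sketch of an electrostatic comb-drive actuator driving a compliant suspension is given below; the finger count, gap, thickness and spring constant are illustrative placeholders, not the SOIMUMPs design values of the paper.

```python
import numpy as np

EPS0 = 8.854e-12          # vacuum permittivity, F/m

def comb_drive_displacement(V, n_fingers=80, thickness=25e-6, gap=2e-6, k_spring=2.0):
    """Lumped behavioral model: lateral comb-drive force divided by suspension stiffness.

    F = n * eps0 * t * V^2 / g  (lateral comb force), x = F / k.
    All parameter values are illustrative placeholders, not the paper's design.
    """
    force = n_fingers * EPS0 * thickness * V**2 / gap
    return force / k_spring

for V in np.linspace(0, 45, 10):
    x_um = comb_drive_displacement(V) * 1e6
    print(f"V = {V:5.1f} V  ->  jaw displacement ~ {x_um:6.3f} um")
```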

Keywords: MEMS Actuator, Behavioral Model, CoventorWare, Microgripper, SOIMUMPs, System Level Simulation

4267 A Background Subtraction Based Moving Object Detection around the Host Vehicle

Authors: Hyojin Lim, Cuong Nguyen Khac, Ho-Youl Jung

Abstract:

In this paper, we propose a moving object detection method that helps the driver safely take his/her car out of a parking lot. When moving objects such as motorbikes, pedestrians, other cars or obstacles are detected at the rear side of the host vehicle, the proposed algorithm provides a warning to the driver. We assume that the host vehicle is just before departure. Gaussian mixture model (GMM) based background subtraction is applied, with pre-processing such as smoothing and post-processing such as morphological filtering added. We examine which color space gives better performance for the detection of moving objects: three color spaces, RGB, YCbCr, and Y, are applied and compared in terms of detection rate. Through simulation, we show that the RGB space is more suitable for moving object detection based on background subtraction.
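The pipeline (smoothing, GMM-based subtraction, morphological filtering) can be sketched with OpenCV as follows; the video path and the area threshold are placeholders.

```python
import cv2

cap = cv2.VideoCapture("rear_camera.avi")          # placeholder video path
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Pre-processing: smoothing. The frame could also be converted, e.g. to YCrCb or
    # grayscale, to compare color spaces; the BGR/RGB frame is kept here.
    blurred = cv2.GaussianBlur(frame, (5, 5), 0)
    fg_mask = subtractor.apply(blurred)             # GMM background subtraction
    # Post-processing: morphological opening/closing removes speckle noise.
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, kernel)
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    moving = [c for c in contours if cv2.contourArea(c) > 500]   # placeholder threshold
    if moving:
        print("warning: moving object detected behind the vehicle")
cap.release()
```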

Keywords: Gaussian mixture model, background subtraction, Moving object detection, color space, morphological filtering.

4266 Structural Cost of Optimized Reinforced Concrete Isolated Footing

Authors: Mohammed S. Al-Ansari

Abstract:

This paper presents an analytical model to estimate the cost of an optimized design of a reinforced concrete isolated footing based on structural safety. Flexural and optimization formulas for square and rectangular footings are derived based on the ACI building code, material cost and optimization. The optimization constraints consist of upper and lower limits on the depth and the area of steel. The footing depth and the area of reinforcing steel are minimized to yield the optimal footing dimensions. The optimized material costs of concrete, reinforcing steel and formwork of the designed sections are computed. A total cost factor (TCF) and other cost factors are developed to generalize and simplify the calculation of footing material cost. Numerical examples are presented to illustrate the model's capability of estimating the material cost of the footing for a desired axial load.

Keywords: Footing, Depth, Concrete, Steel, Formwork, Optimization, Material cost, Cost Factors.

4265 Numerical Simulations of Acoustic Imaging in Hydrodynamic Tunnel with Model Adaptation and Boundary Layer Noise Reduction

Authors: Sylvain Amailland, Jean-Hugh Thomas, Charles Pézerat, Romuald Boucheron, Jean-Claude Pascal

Abstract:

Noise requirements for naval and research vessels have seen an increasing demand for quieter ships in order to fulfil current regulations and to reduce the effects on marine life. Hence, new methods dedicated to the characterization of propeller noise, which is the main source of noise in the far field, are needed. The study of cavitating propellers in a closed test section is useful for analyzing hydrodynamic performance, but it involves significant difficulties for hydroacoustic study, especially due to reverberation and boundary layer noise in the tunnel. The aim of this paper is to present a numerical methodology for the identification of hydroacoustic sources on marine propellers using hydrophone arrays in a large hydrodynamic tunnel. The main difficulties are linked to the reverberation of the tunnel and to the boundary layer noise, which strongly reduce the signal-to-noise ratio. It is proposed to estimate the reflection coefficients using an inverse method and reference transfer functions measured in the tunnel; this reduces the uncertainties of the propagation model used in the inverse problem. In order to reduce the boundary layer noise, a cleaning algorithm is presented that takes advantage of the low-rank and sparse structure of the cross-spectrum matrices of the acoustic signal and the boundary layer noise. This approach allows the acoustic signal to be recovered even well below the boundary layer noise. The improvement brought by this method is visible on acoustic maps obtained with beamforming and DAMAS algorithms.

Keywords: Acoustic imaging, boundary layer noise denoising, inverse problems, model adaptation.

4264 Methods for Better Assessment of Fatigue and Deterioration in Bridges and Other Steel or Concrete Constructions

Authors: J. Menčík, B. Culek, Jr., L. Beran, J. Mareš

Abstract:

Large metal and concrete structures suffer from various kinds of deterioration, and accurate prediction of the remaining life is important. This paper describes two methods for its assessment. One method, suitable for steel bridges and other constructions exposed to fatigue, monitors the loads and damage accumulation using information systems for the operation and a finite element model of the construction. In addition to the operating loads, the dead weight of the construction and thermal stresses can be included in the model. The second method is suitable for concrete bridges and other structures, which suffer from carbonatation and other degradation processes driven by diffusion. The diffusion constant, important for the prediction of future development, can be determined from the depth profile of pH, obtained by pH measurements at various depths. Comparison with measurements on real objects illustrates the suitability of both methods.
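A small sketch of the second method's data reduction is shown below, assuming the common simplification that the carbonation front (located from the pH depth profile) advances with the square root of time; the pH values, depths and ages are illustrative, not measurements from the paper.

```python
import numpy as np

# Hypothetical pH depth profile of a concrete cover (pH measured at several depths, mm).
depth_mm = np.array([2, 5, 10, 15, 20, 30, 40])
ph       = np.array([8.3, 8.6, 8.9, 9.4, 11.8, 12.4, 12.6])

# The carbonation front is commonly taken where pH rises back above a threshold of ~9.
ph_threshold = 9.0
carbonation_depth = np.interp(ph_threshold, ph, depth_mm)   # pH increases with depth here

# With front depths known at several ages, the sqrt-of-time law x = k*sqrt(t) gives a
# rate constant k that stands in for the diffusion-driven progress of the front.
age_years = np.array([5.0, 10.0, 20.0])
depths_mm = np.array([6.0, 8.7, 12.5])                      # illustrative measurements
k = np.sum(depths_mm * np.sqrt(age_years)) / np.sum(age_years)   # least-squares slope
predict_depth = lambda t_years: k * np.sqrt(t_years)
print(carbonation_depth, k, predict_depth(50.0))
```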

Keywords: Bridges, carbonatation, concrete, diagnostics, fatigue, life prediction, monitoring, railway, simulation, structures.

4263 Atomic Clusters: A Unique Building Motif for Future Smart Nanomaterials

Authors: Debesh R. Roy

Abstract:

Understanding the origin and growth mechanism of nanomaterials starting from a fundamental building unit is a challenging problem for scientists. Recently, immense attention has been devoted to the prediction of exceptionally stable atomic cluster units as building blocks for future smart materials. The present study is a systematic investigation of the stability and electronic properties of a series of bimetallic (semiconductor-alkaline earth) clusters, viz., BxMg3 (x=1-5), in search of exceptionally and/or unusually stable motifs. A very popular hybrid exchange-correlation functional, B3LYP, along with a fairly large basis set, viz., 6-31+G(d,p), is employed for this purpose within the density functional formalism. The magic stability among the clusters concerned is explained using the jellium model. It is evident from the present study that the magic stability of the B4Mg3 cluster arises from jellium shell closure.

Keywords: Atomic Clusters, Density Functional Theory, Jellium Model, Magic Clusters, Smart Nanomaterials.

4262 Optimum Shape and Design of Cooling Towers

Authors: A. M. El Ansary, A. A. El Damatty, A. O. Nassef

Abstract:

The aim of the current study is to develop a numerical tool that is capable of achieving an optimum shape and design of hyperbolic cooling towers based on coupling a non-linear finite element model developed in-house and a genetic algorithm optimization technique. The objective function is set to be the minimum weight of the tower. The geometric modeling of the tower is represented by means of B-spline curves. The finite element method is applied to model the elastic buckling behaviour of a tower subjected to wind pressure and dead load. The study is divided into two main parts. The first part investigates the optimum shape of the tower corresponding to minimum weight assuming constant thickness. The study is extended in the second part by introducing the shell thickness as one of the design variables in order to achieve an optimum shape and design. Design, functionality and practicality constraints are applied.
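The geometric core of the approach, a meridian curve represented by a B-spline whose control values act as design variables for the genetic algorithm, can be sketched as follows; the control radii, thickness and density are illustrative, and the weight estimate is a thin-shell approximation rather than the in-house finite element model of the paper.

```python
import numpy as np
from scipy.interpolate import BSpline

# Tower meridian: shell radius (m) as a cubic B-spline of height (m).
# Control radii are hypothetical design variables a genetic algorithm would perturb.
height = 120.0
ctrl_radii = np.array([45.0, 35.0, 27.0, 24.0, 26.0, 30.0])   # throat near mid-height
k = 3                                                          # cubic spline
n_ctrl = len(ctrl_radii)
# Clamped knot vector over [0, height] for n_ctrl control points.
knots = np.concatenate([np.zeros(k),
                        np.linspace(0, height, n_ctrl - k + 1),
                        np.full(k, height)])
meridian = BSpline(knots, ctrl_radii, k)

z = np.linspace(0.0, height, 200)
r = meridian(z)
# Rough shell weight proxy for the optimizer's objective (constant thickness t, density rho).
t, rho = 0.25, 2500.0                                          # m, kg/m^3
dz = z[1] - z[0]
weight = np.sum(2 * np.pi * r * t * rho * dz)                  # kg, thin-shell approximation
print(f"throat radius ~ {r.min():.1f} m, approximate shell mass ~ {weight/1e6:.2f} kt")
```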

Keywords: B-splines, Cooling towers, Finite element, Genetic algorithm, Optimization

4261 Institutional Efficiency of Commonhold Industrial Parks Using a Polynomial Regression Model

Authors: Jeng-Wen Lin, Simon Chien-Yuan Chen

Abstract:

Based on the assumptions of neo-classical economics and rational choice / public choice theory, this paper investigates the regulation of industrial land use in Taiwan by homeowners associations (HOAs) as opposed to traditional government administration. The comparison, which applies transaction cost theory and a polynomial regression analysis, showed that HOAs are superior to conventional government administration in terms of transaction costs and overall efficiency. A case study comparing Taiwan's commonhold industrial park, NangKang Software Park, to traditional government counterparts, using limited data on costs and returns, was analyzed. This empirical study of the relative efficiency of governmental and private institutions supports the underlying theoretical proposition, and the numerical results demonstrate the efficiency of the established model.

Keywords: Homeowners Associations, Institutional Efficiency, Polynomial Regression, Transaction Cost.

4260 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a decision support system (DSS) to support operational decisions in emergency medical service (EMS) systems regarding the assignment of medical emergency requests to emergency departments (ED). This problem is called "hospital selection" and concerns the definition of policies for selecting the ED to which patients who require further treatment are transported by ambulance. The research methodology consists of a first phase reviewing the technical-scientific literature on DSSs supporting EMS management and, in particular, the hospital selection decision. The literature analysis showed that current studies mainly focus on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a mission. All ED-related issues are therefore excluded and treated as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes travel time and frees up the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that also considers the expected time performance in the subsequent phases of the process, such as the case mix, the expected service throughput times, and the operational capacity of the different EDs. To this end, a discrete event simulation (DES) model was created to compare different hospital selection policies. The model was implemented with the AnyLogic software and validated on a realistic case. The hospital selection policy that returned the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the beginning of the clinical evaluation by the doctor in the ED. Finally, two approaches were compared: a static approach, based on a retrospective estimation of the TTP, and a dynamic approach, based on a predictive estimation of the TTP obtained with a constantly updated Winters forecasting model. Findings reveal that minimizing the TTP is the best hospital selection policy: it significantly reduces service throughput times in the ED with a negligible increase in travel time. Furthermore, it provides an immediate view of the saturation state of the ED and accounts for the case mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. In addition, a predictive approach is more reliable for TTP estimation than a retrospective one. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
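The dynamic (predictive) TTP estimation can be sketched with a Winters model from statsmodels as follows; the hourly series, the travel times and the second ED's forecast are synthetic placeholders, and the DES model itself is not reproduced.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic hourly door-to-doctor waiting times (minutes) for one ED, with a daily cycle.
rng = np.random.default_rng(0)
hours = np.arange(21 * 24)                                 # three weeks of hourly data
wait = 40 + 15 * np.sin(2 * np.pi * (hours % 24) / 24) + rng.normal(scale=5, size=hours.size)
series = pd.Series(wait)

# Winters model (additive trend + additive daily seasonality), refitted as new
# observations arrive to keep the estimate constantly updated.
model = ExponentialSmoothing(series, trend="add", seasonal="add", seasonal_periods=24).fit()
next_6h = model.forecast(6)

# Dynamic hospital selection: choose the ED minimizing predicted travel time plus
# the predicted ED wait, which together approximate the Time To Provider (TTP).
travel_min = {"ED_A": 12.0, "ED_B": 18.0}
predicted_wait = {"ED_A": float(next_6h.iloc[0]), "ED_B": 35.0}   # ED_B from its own model
best_ed = min(travel_min, key=lambda ed: travel_min[ed] + predicted_wait[ed])
print(best_ed)
```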

Keywords: Emergency medical services, hospital selection, discrete event simulation, forecast model.

4259 Free Vibration Analysis of Functionally Graded Pretwisted Plate in Thermal Environment Using Finite Element Method

Authors: S. Parida, S. C. Mohanty

Abstract:

The free vibration behavior of a thick pretwisted cantilevered functionally graded material (FGM) plate subjected to a thermal environment is investigated numerically in the present paper. A mathematical model is developed in the framework of higher order shear deformation theory (HOST) with a C0 finite element formulation, i.e., independent displacements and rotations. The material properties are assumed to be temperature dependent and to vary continuously through the thickness according to a simple power law in the volume fraction exponent. The finite element model is discretized into eight-node quadratic serendipity elements with seven degrees of freedom per node. The effects of plate geometry, temperature field and material composition on the vibrational characteristics are examined through modal analysis. Finally, the results are verified by comparison with those available in the literature.
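The simple power law used for the through-thickness gradation can be written down directly; the ceramic and metal moduli below are illustrative, and the temperature dependence of the properties is omitted for brevity.

```python
import numpy as np

def fgm_property(z, h, p_ceramic, p_metal, n_exp):
    """Effective property through the thickness by the simple power law.

    P(z) = (Pc - Pm) * (z/h + 1/2)**n + Pm,  with z in [-h/2, +h/2].
    """
    vf_ceramic = (z / h + 0.5) ** n_exp          # ceramic volume fraction
    return (p_ceramic - p_metal) * vf_ceramic + p_metal

# Illustrative values: Young's moduli (Pa) of a ceramic/metal pair at one temperature.
h = 0.02                                         # plate thickness, m
z = np.linspace(-h / 2, h / 2, 9)
E = fgm_property(z, h, p_ceramic=380e9, p_metal=70e9, n_exp=2.0)
print(np.round(E / 1e9, 1))                      # GPa, metal-rich to ceramic-rich face
```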

Keywords: FGM, pretwisted plate, thermal environment, HOST, simple power law.

4258 Theoretical Considerations for Software Component Metrics

Authors: V. Lakshmi Narasimhan, Bayu Hendradjaya

Abstract:

We have defined two suites of metrics, which cover static and dynamic aspects of component assembly. The static metrics measure the complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. Dynamic metrics are collected during the runtime of a complete application; they are useful for identifying super-components and for evaluating the degree of utilisation of various components. In this paper both static and dynamic metrics are evaluated using Weyuker's set of properties. The result shows that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite to McCall's Quality Model and illustrate its impact on product quality and on the management of component-based product development.

Keywords: Component Assembly, Component Based Software Engineering, CORBA Component Model, Software Component Metrics.

4257 Real-Time Land Use and Land Information System in Homagama Divisional Secretariat Division

Authors: Kumara Jayapathma J. H. M. S. S., Dampegama S. D. P. J.

Abstract:

Land is a valuable and limited resource that constantly changes with the growth of the population. An efficient and sound land management system is essential to avoid conflicts associated with land. This paper aims to design a prototype model of a mobile GIS land use and land information system operating in real time. The Homagama Divisional Secretariat Division, situated in the Western Province of Sri Lanka, was selected as the study area. The prototype model was developed after reviewing the related literature. The methodology consisted of designing and modeling the prototype as an application running on a mobile platform. The system architecture mainly consists of a Google mapping application for real-time updates with Firebase support tools, and the implementation comprises front-end and back-end components. The application was designed in Android Studio with Java, based on the GeoJSON file structure. Android Studio with Java, using GeoJSON files synchronized to Firebase, was found to be a suitable mobile solution for continuously updating the land use and land information system (LIS) in real time in the present scenario. The mobile-based land use and LIS developed in this study is a multi-user application catering to different hierarchy levels such as basic users, supervisory managers, and database administrators. This mobile mapping application will help public sector field officers without GIS expertise to overcome land use planning challenges with land use data updated in real time.
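Although the application itself is written in Java for Android, the data flow, a GeoJSON feature synchronized to Firebase, can be sketched language-agnostically; the sketch below uses the Firebase Realtime Database REST API with a placeholder project URL, database path and parcel attributes.

```python
import json
import requests

# A land parcel captured in the field as a GeoJSON Feature (coordinates are lon/lat).
parcel = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[80.002, 6.844], [80.004, 6.844], [80.004, 6.846],
                         [80.002, 6.846], [80.002, 6.844]]],
    },
    "properties": {"land_use": "paddy", "officer": "field_officer_01",
                   "updated_at": "2023-05-10T09:30:00Z"},
}

# Firebase Realtime Database REST API: POST to a node path ending in ".json"
# creates a new child with an auto-generated key. URL and path are placeholders.
FIREBASE_URL = "https://example-lis-project-default-rtdb.firebaseio.com"
resp = requests.post(f"{FIREBASE_URL}/homagama/land_parcels.json", data=json.dumps(parcel))
resp.raise_for_status()
print("stored under key:", resp.json()["name"])   # Firebase returns the generated key
```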

Keywords: Android, Firebase, GeoJSON, GIS, JAVA, JSON, LIS, mobile GIS, real-time, REST API.

4256 The Entrepreneur's General Personality Traits and Technological Developments

Authors: Bostjan Antoncic

Abstract:

Technological newness and innovativeness are important aspects of small firm development, growth and wealth creation. The contribution of this study to entrepreneurship personality research and to technology-related research in entrepreneurship is that a model of general-personality-driven technological development was developed and empirically tested. Hypotheses relating the big five personality factors (OCEAN: openness, conscientiousness, extraversion, agreeableness, and neuroticism) to technological developments were tested using multiple regression analysis on survey data from a sample of 160 entrepreneurs from Slovenia. The model reveals two personality factors that are predictive of technological developments: openness (positive impact) and neuroticism (negative impact). In addition, a positive impact of firm age on technological developments was found. The other personality factors (conscientiousness, extraversion and agreeableness) may not be considered important for the firm's technological developments.

Keywords: Big five factors, entrepreneur, personality, technology development.

4255 Conflation Methodology Applied to Flood Recovery

Authors: E. L. Suarez, D. E. Meeroff, Y. Yong

Abstract:

Current flooding risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being by nuisance flooding and its long-term effects on communities are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance or averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distribution’s means without the additional information provided by each individual distribution variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources, severe flooding events and nuisance flooding events.
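The conflation step can be illustrated numerically: the conflated density is the normalized product of the individual probability density functions, and for two exponential recovery-time models it is again exponential with the summed rates. The rates below are illustrative, not fitted values from the study.

```python
import numpy as np

# Conflation of two distributions: Q(x) proportional to f1(x) * f2(x), normalized.
# Recovery-time models (months): severe events and nuisance flooding, both exponential.
lam_severe, lam_nuisance = 1.0 / 18.0, 1.0 / 3.0     # illustrative rates (1/mean months)

x = np.linspace(0.0, 120.0, 4001)
dx = x[1] - x[0]
f_severe = lam_severe * np.exp(-lam_severe * x)
f_nuisance = lam_nuisance * np.exp(-lam_nuisance * x)

q = f_severe * f_nuisance
q /= q.sum() * dx                                     # normalize the conflated density

mean_conflated = (x * q).sum() * dx
# For exponentials, the conflation is exponential with rate lam_severe + lam_nuisance,
# so the numerical mean should match 1 / (lam_severe + lam_nuisance).
print(mean_conflated, 1.0 / (lam_severe + lam_nuisance))
```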

Keywords: Community resilience, conflation, flood risk, nuisance flooding.

4254 Flight Control of TUAV with Coaxial Rotor and Ducted Fan Configuration by NARMA-L2 Controllers for Enhanced Situational Awareness

Authors: Igor Astrov, Andrus Pedai, Boris Gordon

Abstract:

This paper focuses on a critical component of situational awareness (SA): the control of autonomous vertical flight for a tactical unmanned aerial vehicle (TUAV). Within the SA strategy, we propose a two-stage flight control procedure using two autonomous control subsystems to address the variation in dynamics and the difference in performance requirements between the initial and final stages of the flight trajectory, for an unmanned helicopter model with coaxial rotor and ducted fan configuration. This control strategy for the chosen TUAV model has been verified by simulation of hovering maneuvers using the software package Simulink and demonstrated good performance for fast stabilization of the engines in hovering; consequently, fast SA with economy of energy can be ensured during search-and-rescue operations.

Keywords: Coaxial rotors, ducted fan, NARMA-L2 neurocontroller, situational awareness, tactical unmanned aerial vehicle.

4253 Functional Store Image and Corporate Social Responsibility Image: A Congruity Analysis on Store Loyalty

Authors: Jamaliah Mohd. Yusof, Rosidah Musa, Sofiah Abd. Rahman

Abstract:

Building on previous studies that examined the importance of functional store image and corporate social responsibility (CSR), this study examines their effects within the self-congruity model in influencing store loyalty. In particular, a structural model based on self-congruity theory is developed and tested in the context of the retailing industry. Whilst many self-congruity studies have incorporated functional store image, there has been a lack of studies examining the social responsibility image of retail stores within self-congruity research. Findings indicate that the influence of self-congruity on store loyalty was mediated by both functional store image and social responsibility image, and that social responsibility image has a stronger influence on store loyalty than functional store image. This study offers important findings and implications for future research, as it presents a new framework highlighting the importance of social responsibility image.

Keywords: Self-congruity, functional store image, social responsibility image, store loyalty

4252 Ultrasound Mechanical Index as a Parameter Affecting of the Ability of Proliferation of Cells

Authors: Z. Hormozi Moghaddam, M. Mokhtari-Dizaji, M. Movahedin, M. E. Ravari

Abstract:

The mechanical index (MI) is used for quantifying the likelihood of acoustic cavitation and expresses the relationship between acoustic pressure and frequency. In this study, modeling of the MI was applied to provide a treatment protocol and to understand the physical processes that affect the proliferation of stem cells. The acoustic pressure and MI equations were modeled and solved to estimate the optimal MI for frequencies of 28, 40 and 150 kHz and 1 MHz, and the radial and axial acoustic pressure distributions were extracted. To validate the modeling results, the acoustic pressure in water at the near-field depth was measured by a piston hydrophone. The results show that the model agrees well with the experiments, with correlation coefficients of 0.91 and 0.90 (p<0.05) for 1 MHz and 40 kHz, respectively. Low-intensity ultrasound with an MI of 0.40 is more effective on the proliferation rate of spermatogonial stem cells during seven days of culture; in contrast, a high MI has a harmful effect on the spermatogonial stem cells. This model supports proper treatment planning in vitro and in vivo by estimating the cavitation phenomenon.
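The MI relation itself is simple to state and invert; the sketch below computes the peak negative pressure corresponding to the reported MI of 0.40 at the study frequencies, using the standard convention of pressure in MPa and frequency in MHz.

```python
import math

def mechanical_index(peak_negative_pressure_mpa, frequency_mhz):
    """MI = derated peak rarefactional pressure (MPa) / sqrt(center frequency (MHz))."""
    return peak_negative_pressure_mpa / math.sqrt(frequency_mhz)

# Peak negative pressure needed to reach MI = 0.40 at the study frequencies.
target_mi = 0.40
for f_khz in (28, 40, 150, 1000):
    f_mhz = f_khz / 1000.0
    p_neg = target_mi * math.sqrt(f_mhz)            # invert the MI definition
    print(f"{f_khz:5d} kHz  ->  peak negative pressure ~ {p_neg*1000:6.1f} kPa for MI = 0.40")
```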

Keywords: Ultrasound, mechanical index, modeling, stem cell.

4251 Complex Dynamics of Bertrand Duopoly Games with Bounded Rationality

Authors: Jixiang Zhang, Guocheng Wang

Abstract:

The dynamics of a Bertrand duopoly game are analyzed, in which the players use different production methods and choose their prices with bounded rationality. The equilibria of the corresponding discrete dynamical system are investigated, and the stability conditions of the Nash equilibrium under a local adjustment process are studied. As some parameters of the model are varied, the Nash equilibrium loses stability, giving rise to complex dynamics such as cycles of higher order and chaos. On this basis, we find that an increase in the adjustment speed of a boundedly rational player can drive the Bertrand market into a chaotic state. Finally, the complex dynamics, bifurcations and chaos are illustrated by numerical simulation.
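A sketch of such a boundedly rational price-adjustment map is given below with an illustrative linear demand and cost specification (not the paper's); small adjustment speeds converge to the Nash equilibrium, while larger ones destabilize it toward cycles and chaos.

```python
import numpy as np

# Illustrative Bertrand duopoly with linear demand and boundedly rational adjustment:
#   q_i = a - p_i + d * p_j,  profit_i = (p_i - c_i) * q_i,
#   p_i(t+1) = p_i(t) + alpha * p_i(t) * d(profit_i)/d(p_i).
a, d = 10.0, 0.5
c = np.array([2.0, 2.5])                     # marginal costs of the two firms

def step(p, alpha):
    marginal = np.array([a - 2 * p[0] + d * p[1] + c[0],
                         a - 2 * p[1] + d * p[0] + c[1]])
    return p + alpha * p * marginal

def tail_of_orbit(alpha, n_transient=1000, keep=4):
    p = np.array([3.0, 3.0])
    for _ in range(n_transient):             # discard the transient
        p = step(p, alpha)
    out = []
    for _ in range(keep):                     # report the long-run behavior
        p = step(p, alpha)
        out.append(np.round(p, 3))
    return out

# A small adjustment speed converges to the Nash equilibrium; a larger one destabilizes
# it into a two-cycle, and increasing alpha further leads to higher-order cycles and chaos.
for alpha in (0.05, 0.11):
    print(alpha, tail_of_orbit(alpha))
```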

Keywords: Bertrand duopoly model, Discrete dynamical system, Heterogeneous expectations, Nash equilibrium.
