Search results for: business process model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11916

7206 A New Model to Perform Preliminary Evaluations of Complex Systems for the Production of Energy for Buildings: Case Study

Authors: Roberto de Lieto Vollaro, Emanuele de Lieto Vollaro, Gianluca Coltrinari

Abstract:

The building sector is responsible, in many industrialized countries, for about 40% of total energy requirements, so it seems necessary to devote some effort in this area in order to achieve a significant reduction of energy consumption and of greenhouse gas emissions. The paper presents a study aiming at providing a design methodology able to identify the best configuration of the building/plant system from a technical, economic and environmental point of view. Normally, the classical approach involves an analysis of a building's energy loads under steady-state conditions, and the subsequent selection of measures aimed at improving the energy performance, based on the previous experience of the architects and engineers in the design team. Instead, the proposed approach uses a sequence of two well-known, scientifically validated calculation methods (TRNSYS and RETScreen) that allow quite a detailed feasibility analysis. To assess the validity of the calculation model, an existing historical building in Central Italy, which will be the object of restoration and preservative redevelopment, was selected as a case study. The building is made of a basement and three floors, with a total floor area of about 3,000 square meters. The first step has been the determination of the heating and cooling energy loads of the building in a dynamic regime by means of TRNSYS, which allows simulating the real energy needs of the building as a function of its use. Traditional methodologies, based as they are on steady-state conditions, cannot faithfully reproduce the effects of varying climatic conditions and of the inertial properties of the structure. With this model it is possible to obtain quite accurate and reliable results that allow identifying effective building-HVAC system combinations. The second step has consisted of using the output data as input to the RETScreen calculation model, which makes it possible to compare different system configurations from the energy, environmental and financial points of view, with an analysis of investment and operation and maintenance costs, thus allowing the economic benefit of possible interventions to be determined. The classical methodology often leads to the choice of conventional plant systems, while our calculation model provides a financial-economic assessment for innovative energy systems with low environmental impact. Computational analysis can help in the design phase, particularly in the case of complex structures with centralized plant systems, by comparing the data returned by the calculation model for different design options.

Keywords: Energy, Buildings, Systems, Evaluation.

7205 A Comparison of Single Decision Tree, Decision Tree Forest and Group Method of Data Handling to Evaluate the Surface Roughness in Machining Process

Authors: S. Ghorbani, N. I. Polushin

Abstract:

The machinability of workpieces (AISI 1045 steel, AA2024 aluminum alloy, A48 class 30 gray cast iron) in turning operations has been investigated using different types of cutting tool (conventional, cutting tool with holes in the toolholder, and cutting tool filled with composite material) under dry conditions on a turning machine at different settings of spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev), depth of cut (0.05-0.15 mm) and tool overhang (41-65 mm). Experimentation was performed as per Taguchi's orthogonal array. To evaluate the relative importance of the factors affecting surface roughness, the single decision tree (SDT), decision tree forest (DTF) and group method of data handling (GMDH) were applied.

Keywords: Decision Tree Forest, GMDH, surface roughness, Taguchi method, turning process.
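
A minimal sketch of the factor-ranking step this abstract describes, using a single decision tree; the factor levels and roughness values below are invented placeholders, not the paper's Taguchi data:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
# Columns: spindle speed (rpm), feed rate (mm/rev), depth of cut (mm), overhang (mm)
X = rng.uniform([630, 0.05, 0.05, 41], [1000, 0.075, 0.15, 65], size=(27, 4))
y = 8.0 * X[:, 1] + 0.02 * X[:, 3] + rng.normal(0, 0.05, 27)  # synthetic roughness Ra

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
for name, importance in zip(["speed", "feed", "depth", "overhang"],
                            tree.feature_importances_):
    print(f"{name}: {importance:.2f}")
```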

7204 Biological Soil Conservation Planning by Spatial Multi-Criteria Evaluation Techniques (Case Study: Bonkuh Watershed in Iran)

Authors: Ali Akbar Jamali

Abstract:

This paper discusses the site selection process for biological soil conservation planning, supported by a value-focused approach and spatial multi-criteria evaluation techniques. A first set of spatial criteria was used to design a number of potential sites. Next, a new set of spatial and non-spatial criteria was employed, including natural factors and financial costs, to assess the degree of suitability of the Bonkuh watershed for biological soil conservation planning and to recommend the most acceptable program. The whole process was facilitated by a new software tool that supports spatial multiple criteria evaluation (SMCE) in GIS software (ILWIS). The application of this tool, combined with continual feedback from public attention, has provided an effective methodology for solving complex decision problems in biological soil conservation planning.

Keywords: GIS, Biological soil conservation planning, Spatial multi-criteria evaluation, Iran

7203 Towards Modeling for Crashes: A Low-Cost Adaptive Methodology for Karachi

Authors: Mohammad Ahmed Rehmatullah

Abstract:

The aim of this paper is to discuss a low-cost methodology that can predict traffic flow conflicts and quantitatively rank crash expectancies (based on relative probability) for various traffic facilities. The paper focuses on the application of statistical distributions to model traffic flow and of Monte Carlo techniques to simulate traffic, and discusses how to create a tool to predict the possibility of a traffic crash. A low-cost data collection methodology is discussed for the heterogeneous traffic flow found in Karachi, and a GIS platform is proposed to thematically represent simulated traffic flow and the probability of a crash. Furthermore, the dynamism of the model is discussed with reference to its adaptability, adequacy, economy, and efficiency to ensure adoption.

Keywords: Heterogeneous traffic data collection, Monte Carlo Simulation, Traffic Flow Modeling, GIS.
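
A hedged sketch of the Monte Carlo idea in this abstract: sample vehicle headways from a statistical distribution and estimate the fraction of conflict-prone gaps. The arrival rate and critical gap are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
ARRIVAL_RATE = 0.4   # vehicles per second (assumed)
CRITICAL_GAP = 1.5   # headway in seconds below which a conflict is assumed

def simulate_conflict_fraction(n_vehicles=1000):
    # Exponential headways correspond to Poisson arrivals of vehicles.
    headways = rng.exponential(1.0 / ARRIVAL_RATE, n_vehicles)
    return np.mean(headways < CRITICAL_GAP)

estimates = [simulate_conflict_fraction() for _ in range(100)]
print(f"conflict-prone gap fraction: {np.mean(estimates):.3f} +/- {np.std(estimates):.3f}")
```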

7202 Modeling the Moment of Resistance Generated by an Ore-Grinding Mill

Authors: Marinka Baghdasaryan, Tigran Mnoyan

Abstract:

The pertinence of modeling the moment of resistance generated by an ore-grinding mill is substantiated. Based on the ranking of technological indices obtained from a survey among the specialists of several beneficiation plants, the factors determining the level of the moment of resistance generated by the mill are revealed. An a priori diagram of the ranks is obtained, in which the factors are arranged in descending order of their degree of impact on the level of the moment. The obtained model of the moment of resistance reflects the technological character of the mill's operating modes and can be used for improving the operating modes of the motor-mill system and preventing abnormal modes of the drive synchronous motor.

Keywords: Model, abnormal mode, mill, correlation, moment of resistance, rotational speed.

7201 Application of Artificial Neural Network for Predicting Maintainability Using Object-Oriented Metrics

Authors: K. K. Aggarwal, Yogesh Singh, Arvinder Kaur, Ruchika Malhotra

Abstract:

The importance of software quality is increasing, leading to the development of new, sophisticated techniques that can be used to construct models for predicting quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examines the application of ANNs to software quality prediction using Object-Oriented (OO) metrics. Quality estimation includes estimating the maintainability of software. The dependent variable in our study was maintenance effort. The independent variables were principal components of eight OO metrics. The results showed that the Mean Absolute Relative Error (MARE) of the ANN model was 0.265. We thus found the ANN method useful in constructing software quality models.

Keywords: Software quality, Measurement, Metrics, Artificial neural network, Coupling, Cohesion, Inheritance, Principal component analysis.
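
A minimal sketch, on synthetic data, of the pipeline this abstract outlines: principal components of OO metrics feed an ANN, scored by Mean Absolute Relative Error (MARE). The eight metric columns and the effort target are random stand-ins, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
metrics = rng.normal(size=(100, 8))                  # eight OO metrics (synthetic)
effort = metrics @ rng.uniform(0.5, 2.0, 8) + 20.0   # synthetic maintenance effort

components = PCA(n_components=4).fit_transform(metrics)
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=1)
ann.fit(components[:80], effort[:80])                # train on the first 80 cases

predicted = ann.predict(components[80:])
mare = np.mean(np.abs((effort[80:] - predicted) / effort[80:]))
print(f"MARE = {mare:.3f}")
```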

7200 A Simple Approach to Three-Phase Distribution System Modeling for Power Flow Calculations

Authors: J. B. V. Subrahmanyam, C. Radhakrishna

Abstract:

This paper presents a simple three-phase power flow method for the solution of three-phase unbalanced radial distribution networks (RDN) with voltage-dependent loads. It solves a simple algebraic recursive expression of voltage magnitude, and all the data are stored in vector form. The algorithm uses basic principles of circuit theory and can be easily understood. Mutual coupling between the phases has been included in the mathematical model. The proposed algorithm has been tested on several unbalanced radial distribution networks and the results are presented in the article. The 8-bus and IEEE 13-bus unbalanced radial distribution system results are in agreement with the literature and show that the proposed model is valid and reliable.

Keywords: radial distribution networks, load flow, circuit model, three-phase four-wire, unbalance.
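
The recursive voltage expression the abstract mentions can be illustrated with a single-phase backward/forward sweep on a toy radial feeder; the real model is three-phase with mutual coupling, and the line data and loads below are invented:

```python
import numpy as np

V_SOURCE = 1.0 + 0j                    # slack bus voltage (p.u.)
z = np.array([0.01 + 0.02j] * 3)       # series impedance of each section (p.u.)
s_load = np.array([0.1 + 0.05j] * 3)   # complex power load at each bus (p.u.)

V = np.full(3, V_SOURCE)
for _ in range(20):                           # sweep until voltages settle
    I_load = np.conj(s_load / V)              # backward: load injection currents
    I_branch = np.cumsum(I_load[::-1])[::-1]  # branch current = sum of downstream loads
    V = V_SOURCE - np.cumsum(z * I_branch)    # forward: recursive voltage drops
print(np.abs(V))                              # bus voltage magnitudes
```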

7199 Air Pollution and Respiratory-Related Restricted Activity Days in Tunisia

Authors: Mokhtar Kouki, Inès Rekik

Abstract:

This paper focuses on the assessment of the air pollution and morbidity relationship in Tunisia. Air pollution is measured by the ozone air concentration, and morbidity is measured by the number of respiratory-related restricted activity days during the 2-week period prior to the interview. Socioeconomic data are also collected in order to adjust for any confounding covariates. Our sample is composed of 407 Tunisian respondents; 44.7% are women, the average age is 35.2, nearly 69% live in a house built after 1980, and 27.8% reported at least one day of respiratory-related restricted activity. The model consists of regressing the number of respiratory-related restricted activity days on the air quality measure and the socioeconomic covariates. In order to correct for zero-inflation and heterogeneity, we estimate several models (Poisson, negative binomial, zero-inflated Poisson, Poisson hurdle, negative binomial hurdle and finite mixture Poisson models). Bootstrapping and post-stratification techniques are used in order to correct for any sample bias. According to the Akaike information criterion, the hurdle negative binomial model has the best goodness of fit. The main result indicates that, after adjusting for socioeconomic data, ozone concentration increases the probability of a positive number of restricted activity days.

Keywords: Bootstrapping, hurdle negbin model, overdispersion, ozone concentration, respiratory-related restricted activity days.
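
A minimal sketch of the model-comparison step: fit Poisson and negative binomial regressions to a count outcome and compare AIC values. The hurdle, zero-inflated, and finite mixture variants from the paper need extra machinery, and the data below are synthetic stand-ins for the survey:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 407
ozone = rng.uniform(20, 120, n)            # assumed ozone concentration scale
X = sm.add_constant(ozone)
p = 1.0 / (1.0 + np.exp(0.01 * ozone))     # success probability falls with ozone
days = rng.negative_binomial(1, p)         # overdispersed restricted-activity days

poisson_fit = sm.Poisson(days, X).fit(disp=False)
negbin_fit = sm.NegativeBinomial(days, X).fit(disp=False)
print(f"Poisson AIC: {poisson_fit.aic:.1f}, NegBin AIC: {negbin_fit.aic:.1f}")
```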

7198 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information necessary for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.

Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.
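
The signal-handling idea behind the DAQ Debugger can be illustrated generically: install a handler that appends a stack report when a signal arrives, while the process keeps running. This is a hedged Python illustration (Unix-only, using SIGUSR1), not the iFDAQ implementation:

```python
import signal
import traceback

def dump_report(signum, frame):
    # Append a diagnostic report; the real tool records far more state.
    with open("debug_report.txt", "a") as report:
        report.write(f"--- signal {signum} received ---\n")
        traceback.print_stack(frame, file=report)

signal.signal(signal.SIGUSR1, dump_report)   # e.g. `kill -USR1 <pid>` triggers it
print("handler installed; the process continues normally")
```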

7197 Vocational Skills, Recognition of Prior Learning and Technology: The Future of Higher Education

Authors: Shankar Subramanian Iyer

Abstract:

Vocational education, enhanced by technology and Recognition of Prior Learning (RPL), is going to be the main ingredient of the future of education. This stems from various issues with the current educational system: cost, time, type of course, type of curriculum, and unemployment, to name the major ones. Most millennials like to perform and learn rather than learn how to perform. This is the essence of vocational education, be it any field from cooking, painting, and plumbing to modern technologies using computers. Even a more theoretical course like entrepreneurship can be taught by being an entrepreneur and learning its nuances. The best way to learn accountancy is actually keeping accounts for a small business or grocer and learning the ropes of accountancy and finance. The purpose of this study is to investigate the relationship of vocational skills, RPL and new technologies with future employability. This study implies that individuals' knowledge and skills are essential aspects to be emphasized in future education, and that credit should be given for prior experience with a view to future employability. Virtual reality can be used to simulate workplace situations for vocational learning in fields like hospitality, medical emergencies, healthcare, draughtsmanship, building inspection, quantity surveying, and estimation, to name a few. All disruptions in future education, especially vocational education, are going to be technology driven with the advent of AI, ML, IoT, VR, VI, etc. Vocational education not only helps institutes cut costs drastically, but allows all students to have hands-on experience rather than being observers. The proposed contribution of this paper is the earlier experiential learning theory and the recent theory of knowledge- and skills-based learning, modified and applied to vocational education and the development of skills. Apart from a secondary research study of major scholarly articles and books, primary research using interviews and questionnaire surveys has been used to validate and test the reliability of the suggested model using the Partial Least Squares Structural Equation Method (PLS-SEM), with the factors assimilated from an existing literature review. The major finding is that a strong relationship exists between vocational skills, RPL, and new technology and future employability, mediated by future employability skills.

Keywords: Vocational education, vocational skills, competencies, modern technologies, Recognition of Prior Learning, RPL.

7196 Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework

Authors: J. Grira, Y. Bédard, S. Roche

Abstract:

The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirement engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during GDD, some foreseeable risks may be overlooked and never reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and ultimately to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts discover implicit requirements and risks.

Keywords: Collaborative risk analysis, intention of use, Geospatial database design, Geospatial data misuse.

7195 Daily and Seasonal Changes of Air Pollution in Kuwait

Authors: H. Ettouney, A. AL-Haddad, S. Saqer

Abstract:

This paper focuses on the assessment of air pollution in Umm-Alhyman, Kuwait, which is located south of oil refineries, a power station, an oil field, and highways. The measurements were made over a period of four days in March and July of 2001, 2004, and 2008. The measured pollutants included methane and non-methane hydrocarbons (MHC, NMHC), CO, CO2, SO2, NOx, O3, and PM10. Meteorological parameters were also measured, including temperature, wind speed and direction, and solar radiation. Over the study period, data analysis showed increases in measured SO2, NOx and CO by factors of 1.2, 5.5 and 2, respectively. This is explained in terms of the increase in industrial activities, motor vehicle density, and power generation. Predictions of the measured data were made with the ISC-AERMOD software package using the ISCST3 model option. Finally, the measured data were compared against international standards.

Keywords: Air pollution, Emission inventory, ISCST3 model, Modeling

7194 Locating Critical Failure Surface in Rock Slope Stability with Hybrid Model Based on Artificial Immune System and Cellular Learning Automata (CLA-AIS)

Authors: Ramin Javadzadeh, Emad Javadzadeh

Abstract:

Locating the critical slip surface with the minimum factor of safety for a rock slope is a difficult problem. In recent years, some modern global optimization methods have been developed and applied with success to various types of problems, but very few of these methods have been applied to rock mechanics problems. In this paper, the use of a hybrid model based on an artificial immune system and cellular learning automata is proposed. The results show that the algorithm is an effective and efficient optimization method with a high confidence rate.

Keywords: CLA-AIS, failure surface, optimization methods, rock slope.

7193 Space Vector Pulse Width Modulation Technique Based Design and Simulation of Three-Phase Voltage Source Converter Systems

Authors: Farhan Beg

Abstract:

A Space Vector based Pulse Width Modulation control technique for the three-phase PWM converter is proposed in this paper. The proposed control scheme is based on a synchronous reference frame model. High performance and efficiency are obtained with regard to the DC bus voltage and the power factor of the PWM rectifier, leading to low losses. MATLAB/SIMULINK is used as the simulation platform, and a SIMULINK model is presented in the paper. The results show that the proposed model demonstrates better performance and properties than the traditional SPWM method and drastically improves the dynamic performance of the closed loop. In SPWM, a sine signal is the reference waveform and a triangle waveform is the carrier. When the sine signal is larger than the triangle signal, the output pulse goes high; when the triangle signal is higher than the sine signal, the pulse goes low. The SPWM output is changed by varying the modulation index and the frequency used in the system to produce more pulses per period. The more pulses produced, the lower the harmonic content of the output voltage and the higher the resolution.

Keywords: Power Factor, SVPWM, PWM rectifier, SPWM.
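
The comparison rule restated in the abstract maps directly to code: the pulse is high while the sine reference exceeds the triangular carrier. A small sketch with illustrative frequencies and modulation index:

```python
import numpy as np

t = np.linspace(0, 0.02, 20000, endpoint=False)  # one 50 Hz fundamental period
m = 0.8                                          # modulation index (assumed)
reference = m * np.sin(2 * np.pi * 50 * t)       # sine reference waveform
# 1 kHz triangular carrier spanning [-1, 1]
carrier = 2 * np.abs(2 * ((t * 1000) % 1) - 1) - 1

pulses = (reference > carrier).astype(int)       # high when sine > triangle
print(f"duty ratio over the period: {pulses.mean():.2f}")
```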

7192 Detecting Earnings Management via Statistical and Neural Network Techniques

Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie

Abstract:

Predicting earnings management is vital for capital market participants, financial analysts and managers. The aim of this research is to answer the following question: Is there a significant difference between the regression model and neural network models in predicting earnings management, and which one yields the superior prediction? In approaching this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study includes 94 listed companies in the Tehran Stock Exchange (TSE) market from 2003 to 2011. After the results of all models were acquired, ANOVA was applied to test the hypotheses. In general, the statistical results showed that the precision of the GRNN did not differ significantly from that of the MLP. In addition, the mean square errors of the MLP and GRNN differed significantly from that of the multivariable LR model. These findings support the notion of nonlinear behavior of earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management using neural network techniques rather than linear regression models.

Keywords: Earnings management, generalized regression neural networks, linear regression, multi-layer perceptron, Tehran stock exchange.
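
A compact sketch of the comparison design, on synthetic data: fit a linear regression and an MLP to the same nonlinear target and compare mean squared errors. GRNN has no scikit-learn implementation, so only two of the paper's three models appear, and the features are stand-ins for the TSE accrual variables:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
X = rng.normal(size=(94, 5))                   # 94 firms, 5 predictors (synthetic)
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 94)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=3)
for model in (LinearRegression(), MLPRegressor(max_iter=3000, random_state=3)):
    mse = mean_squared_error(y_te, model.fit(X_tr, y_tr).predict(X_te))
    print(f"{type(model).__name__}: MSE = {mse:.4f}")
```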

7191 An Exact Algorithm for Location–Transportation Problems in Humanitarian Relief

Authors: Chansiri Singhtaun

Abstract:

This paper proposes a mathematical model and examines the performance of an exact algorithm for a location–transportation problem in humanitarian relief. The model determines the number and location of distribution centers in a relief network, the amount of relief supplies to be stocked at each distribution center, and the vehicles that take the supplies to meet the needs of disaster victims, under capacity restrictions and transportation and budgetary constraints. The computational experiments are conducted on generated problems of various sizes. A branch-and-bound algorithm is applied to these problems. The results show that the algorithm can solve problem sizes of up to three candidate locations with five demand points, and one candidate location with up to twenty demand points, without premature termination.

Keywords: Disaster response, facility location, humanitarian relief, transportation.
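
For instances as small as those reported (three candidate locations, five demand points), even a brute-force baseline is tractable and shows the structure a branch-and-bound search exploits; the costs below are invented, and the capacity and budget constraints are omitted from this sketch:

```python
from itertools import combinations

open_cost = [10, 12, 8]  # cost of opening each candidate distribution center (assumed)
ship = [[4, 6, 9], [5, 4, 7], [6, 3, 4], [8, 5, 6], [7, 9, 3]]  # demand point x DC

best_cost, best_sites = float("inf"), None
for k in range(1, len(open_cost) + 1):
    for sites in combinations(range(len(open_cost)), k):
        total = sum(open_cost[s] for s in sites)
        # assign each demand point to its cheapest open DC (uncapacitated sketch)
        total += sum(min(row[s] for s in sites) for row in ship)
        if total < best_cost:
            best_cost, best_sites = total, sites
print(f"best cost {best_cost} with open DCs {best_sites}")
```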

7190 The Effects of Shot and Grit Blasting Process Parameters on Steel Pipes Coating Adhesion

Authors: Saeed Khorasanizadeh

Abstract:

The adhesion strength of the exterior or interior coating of steel pipes is very important. Increasing coating adhesion on surfaces can increase the lifetime of the coating and the safety factor of the transmission pipeline, and decrease the rate of corrosion and costs. Steel pipe surfaces are prepared before coating by shot and grit blasting, a mechanical method. Parameters affecting this process include abrasive particle size, distance to the surface, abrasive flow rate, abrasive physical properties and shape, selection of abrasive, kind of machine and its power, standard of surface cleanliness, roughness, blasting time, and weather humidity. This research aimed to find conditions that improve surface preparation, adhesion strength, and corrosion resistance of the coating. This paper therefore studies the effects of varying the abrasive flow rate, abrasive particle size, and blasting time on steel surface roughness, including the effect of over-blasting, using a centrifugal blasting machine. A number of steel samples (according to API 5L X52) were prepared and coated with epoxy powder, and the coating adhesion strengths were compared by pull-off testing. The results show that increasing the abrasive particle size and flow rate increases the steel surface roughness and coating adhesion strength, but increasing the blasting time leads to over-blasting of the surface, raising the surface temperature and hardness and decreasing the steel surface roughness and coating adhesion strength.

Keywords: surface preparation, abrasive particles, adhesion strength.

7189 A Functional Interpretation of Quantum Theory

Authors: Hans H. Diel

Abstract:

In this paper a functional interpretation of quantum theory (QT), with emphasis on quantum field theory (QFT), is proposed. Besides the usual statements on relations between a function's initial state and final state, a functional interpretation also contains a description of the dynamic evolution of the function. That is, it describes how things function. The proposed functional interpretation of QT/QFT has been developed in the context of the author's work towards a computer model of QT, with the goal of supporting the largest possible scope of QT concepts. In the course of this work, the author encountered a number of problems inherent in the translation of quantum physics into a computer program. He came to the conclusion that the goal of supporting the major QT concepts can only be satisfied if the present model of QT is supplemented by a "functional interpretation" of QT/QFT. The paper describes a proposal for that.

Keywords: Computability, Foundation of Quantum Mechanics, Measurement Problem, Models of Physics.

7188 Variational Explanation Generator: Generating Explanation for Natural Language Inference Using Variational Auto-Encoder

Authors: Zhen Cheng, Xinyu Dai, Shujian Huang, Jiajun Chen

Abstract:

Recently, explanatory natural language inference has attracted much attention for the interpretability of logic relationship prediction, also known as explanation generation for Natural Language Inference (NLI). Existing explanation generators based on discriminative encoder-decoder architectures have achieved noticeable results. However, we find that these discriminative generators usually generate explanations with correct evidence but incorrect logic semantics. This is because logic information is implicitly encoded in premise-hypothesis pairs and is difficult to model. In fact, the same logic information exists in both the premise-hypothesis pair and its explanation, and the logic information explicitly contained in the target explanation is easy to extract. Hence we assume that there exists a latent space of logic information while generating explanations. Specifically, we propose a generative model called Variational Explanation Generator (VariationalEG) with a latent variable to model this space. Trained with the guidance of explicit logic information in target explanations, the latent variable in VariationalEG can effectively capture the implicit logic information in premise-hypothesis pairs. Additionally, to tackle the problem of posterior collapse while training VariationalEG, we propose a simple yet effective approach called Logic Supervision on the latent variable to force it to encode logic information. Experiments on the explanation generation benchmark explanation-Stanford Natural Language Inference (e-SNLI) demonstrate that the proposed VariationalEG achieves significant improvement compared to previous studies and yields a state-of-the-art result. Furthermore, we analyze the generated explanations to demonstrate the effect of the latent variable.

Keywords: Natural Language Inference, explanation generation, variational auto-encoder, generative model.

7187 CFD Simulation of Hydrodynamic Behaviors and Gas-Liquid Mass Transfer in a Stirred Airlift Bioreactor

Authors: Sérgio S. de Jesus, Edgar Leonardo Martínez, Aulus R.R. Binelli, Aline Santana, Rubens Maciel Filho

Abstract:

The speed profiles, gas holdup (eG) and global oxygen transfer coefficient (kLa) of a stirred airlift bioreactor using water as the model fluid were investigated by computational fluid dynamics modeling. The parameters predicted by the computer model were validated with the experimental data, and the CFD results were very close to those obtained experimentally. The simulations showed a prevalent impeller effect at low speeds, propelling a large volume of fluid against the walls of the vessel, which, without recirculation, results in low values of eG and kLa; however, with increasing air velocity, the impeller effect becomes smaller and the air flow in the riser region becomes greater, causing fluid recirculation, which explains the increase in eG and kLa.

Keywords: CFD, Hydrodynamics, Mass transfer, Stirred airlift bioreactor.

7186 A Fuzzy Dynamic Load Balancing Algorithm for Homogenous Distributed Systems

Authors: Ali M. Alakeel

Abstract:

Load balancing in distributed computer systems is the process of redistributing the workload among processors in the system to improve system performance. Most previous research in using fuzzy logic for load balancing has concentrated only on utilizing fuzzy logic concepts to describe processor loads and task execution lengths. The responsibility for the fuzzy-based load balancing process itself, however, has not been discussed, and in most reported work is assumed to be performed in a distributed fashion by all nodes in the network. This paper proposes a new fuzzy dynamic load balancing algorithm for homogenous distributed systems. The proposed algorithm utilizes fuzzy logic in dealing with inaccurate load information, making load distribution decisions, and maintaining overall system stability. In terms of control, we propose a new approach that specifies how, when, and by which node load balancing is implemented. Our approach is called Centralized-But-Distributed (CBD).

Keywords: Dynamic load balancing, fuzzy logic, distributed systems, algorithm.
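
One ingredient the abstract names, describing a processor's load with fuzzy sets, can be sketched with triangular membership functions; the breakpoints are illustrative, and the CBD control logic built on top of them is not shown:

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def load_memberships(utilization):
    # Degrees to which a node's utilization is "light", "moderate", "heavy".
    return {
        "light": triangular(utilization, -0.01, 0.0, 0.5),
        "moderate": triangular(utilization, 0.2, 0.5, 0.8),
        "heavy": triangular(utilization, 0.5, 1.0, 1.01),
    }

print(load_memberships(0.65))  # e.g. partly "moderate", partly "heavy"
```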

7185 Evaluation of Internet Anxiety in SRBIAU Higher Education Students in Research Process

Authors: Nima Babazadeh Gashti, Nazanin Pilevari

Abstract:

The increase in internet use creates some problems, one of which is "internet anxiety". Internet anxiety is a type of anxiety that people may feel while surfing the internet or using the internet for educational purposes, blogging, or streaming from digital libraries. The goal of this study is to evaluate internet anxiety among management students. In this research, Ealy's internet anxiety questionnaire, consisting of positive and negative items, was completed by 310 participants. According to the findings, about 64.7% of them scored at or below the mean anxiety score (50). The distribution of internet anxiety scores was normal and there was no meaningful difference between men's and women's anxiety levels in this sample. Results also showed no meaningful difference in internet anxiety level between different fields of study in management. This evaluation will help managers perform a gap analysis between the existing level and the desired one. Future work would provide techniques for abating human anxiety while using the internet via human-computer interaction techniques.

Keywords: Internet, anxiety, research process, internet identification, human computer interaction.

7184 Migration from Commercial to in-House Developed Learning Management Systems

Authors: Lejla A. Bexheti, Visar S. Shehu, Adrian A. Besimi

Abstract:

Learning Management Systems provide a learning environment that offers a collection of e-learning tools in a package with a common interface and information sharing among the tools. South East European University's initial experience with LMSs was with the commercial LMS ANGEL. After three years of ANGEL usage, it was decided, because the expenses were very high, to develop our own software. As part of the research project team for the in-house design and development of the new LMS, we primarily had to select the features that would cover our needs and comply with current trends in software development, and then design and develop the system. In this paper we present the process of in-house LMS development for South East European University, its architecture, conception and strengths, with a special accent on the process of migration and integration with other enterprise applications.

Keywords: e-learning tools, LMS, migration, user feedback.

7183 Towards an Enhanced Stochastic Simulation Model for Risk Analysis in Highway Construction

Authors: Anshu Manik, William G. Buttlar, Kasthurirangan Gopalakrishnan

Abstract:

Over the years, there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS and the contractor is given considerable freedom in achieving those characteristics. The risk for the contractor or agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful in estimating and analyzing payment risk in ERS systems, and they form an integral part of Illinois's prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids and asphalt content data from ERS projects. The information gained would be crucial in simulating these ERS projects for the estimation and analysis of payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the present ERS implemented in Illinois.

Keywords: Asphalt Pavement, Risk Analysis, Stochastic Simulation, QC/QA.

7182 Correlation of Viscosity in Nanofluids using Genetic Algorithm-neural Network (GA-NN)

Authors: Hajir Karimi, Fakheri Yousefi, Mahmood Reza Rahimi

Abstract:

An accurate and proficient genetic algorithm (GA) based artificial neural network (ANN) is developed for predicting nanofluid viscosity. The genetic algorithm is used to optimize the neural network parameters, minimizing the error between the predicted viscosity and the experimental one. Experimental viscosities for two nanofluids, Al2O3-H2O and CuO-H2O, from 278.15 to 343.15 K and volume fractions up to 15%, were taken from the literature. The results of this study reveal that the GA-NN model outperforms conventional neural nets in predicting the viscosity of nanofluids, with mean absolute relative errors of 1.22% and 1.77% for Al2O3-H2O and CuO-H2O, respectively. Furthermore, the results of this work have been compared with other models. The findings demonstrate that the GA-NN model is an effective method for predicting the viscosity of nanofluids, with better accuracy and simplicity than the other models.

Keywords: genetic algorithm, nanofluids, neural network, viscosity
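
A toy version of the GA-NN coupling: a small genetic loop evolves a network's hidden-layer size and learning rate to minimize prediction error. The temperature/volume-fraction inputs and the viscosity trend are synthetic, not the literature data used in the paper:

```python
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
T = rng.uniform(278.15, 343.15, 120)        # temperature (K)
phi = rng.uniform(0.0, 0.15, 120)           # nanoparticle volume fraction
X = np.column_stack([T, phi])
y = 1.0 + 5.0 * phi - 0.005 * (T - 278.15)  # synthetic viscosity trend

def fitness(genes):
    hidden, lr = int(genes[0]), float(genes[1])
    net = MLPRegressor((hidden,), learning_rate_init=lr, max_iter=500, random_state=0)
    return mean_absolute_percentage_error(y, net.fit(X, y).predict(X))

population = [np.array([rng.integers(2, 20), rng.uniform(1e-4, 1e-1)]) for _ in range(6)]
for _ in range(5):                          # a few GA generations
    population.sort(key=fitness)            # selection: keep the fittest first
    parents = population[:3]
    children = [np.clip(p + rng.normal(0, [1.0, 0.005]), [2, 1e-4], [30, 0.2])
                for p in parents]           # mutation with clipped gene ranges
    population = parents + children
print(f"best training MAPE: {fitness(population[0]):.3%}")
```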

7181 Performance Prediction of Multi-Agent Based Simulation Applications on the Grid

Authors: Dawit Mengistu, Lars Lundberg, Paul Davidsson

Abstract:

A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands understanding its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload identifies the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline for estimating the performance characteristics of real-world simulation applications.

Keywords: Grid computing, Performance modeling, Performance prediction, Multi-agent simulation.

7180 Mathematical Modelling of Single Phase Unity Power Factor Boost Converter

Authors: Sanjay L. Kurkute, Pradeep M. Patil, Kakasaheb C. Mohite

Abstract:

An optimal control strategy based on a simple model of a single-phase unity power factor boost converter is presented, with an evaluation of first-order differential equations. This paper presents an evaluation of a single-phase boost converter with power factor correction. A simple discrete model of the boost converter is formed and optimal control is obtained; a digital PI controller is adopted to correct the control error. The method of instantaneous current control is proposed in this paper for its good tracking performance and dynamic response. The simulation and experimental results verified our design.

Keywords: Single phase, boost converter, Power factor correction (PFC), Pulse Width Modulation (PWM).
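
A schematic discrete PI loop of the kind the abstract mentions ("a digital PI controller is adopted"), tracking a reference current in a toy first-order plant standing in for the converter; the gains and plant constants are illustrative, not the paper's converter model:

```python
KP, KI, DT = 0.5, 50.0, 1e-4   # proportional gain, integral gain, time step (s)

def run_pi(reference=1.0, steps=2000):
    current, integral = 0.0, 0.0
    for _ in range(steps):
        error = reference - current
        integral += error * DT
        duty = KP * error + KI * integral       # PI control law
        duty = min(max(duty, 0.0), 0.95)        # duty-cycle saturation
        current += DT * (200.0 * duty - 20.0 * current)  # toy plant: di/dt = 200d - 20i
    return current

print(f"current after {2000 * DT * 1e3:.0f} ms: {run_pi():.3f}")
```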

7179 Acceptance of Mobile Learning: a Respecification and Validation of Information System Success

Authors: Chin-Cheh Yi, Pei-Wen Liao, Chin-Feng Huang, I-Hui Hwang

Abstract:

With the proliferation of mobile computing technology, mobile learning (m-learning) will play a vital role in the rapidly growing electronic learning market. However, the acceptance of m-learning by individuals is critical to the successful implementation of m-learning systems. Thus, there is a need to research the factors that affect users' intention to use m-learning. Based on an updated information system (IS) success model, data collected from 350 respondents in Taiwan were tested against the research model using the structural equation modeling approach. The data collected by questionnaire were analyzed to check the validity of the constructs. Then hypotheses describing the relationships between the identified constructs and users' satisfaction were formulated and tested.

Keywords: m-learning, information system success, users' satisfaction, perceived value.

7178 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on data from sled tests performed according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and is validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming at reducing the number of tests.

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

7177 FEM Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli

Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha

Abstract:

Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3600 indentation measurements. This database served to create the same number of ensembles as the number of segments in the tested beam. Statistics of these ensembles were then assigned to the given segments of the beams, and the Latin Hypercube Sampling (LHS) method was called to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. Here, a detailed geometrical arrangement of the individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit the full-scale measurements of the timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change in the Young's modulus distribution takes place in the laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. Posterior distributions of the moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.

Keywords: Bayesian inference, FEM, four point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young’s modulus.
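
The sampling step can be sketched with scipy's Latin Hypercube design: each of 100 draws assigns a modulus to every segment and feeds a stand-in beam model to build the deflection ensemble. The modulus statistics and the deflection formula are placeholders, not the measured values or the paper's FEM model:

```python
import numpy as np
from scipy.stats import norm, qmc

N_SEGMENTS, N_SIMS = 10, 100
sampler = qmc.LatinHypercube(d=N_SEGMENTS, seed=0)
u = sampler.random(N_SIMS)                    # LHS design on [0, 1]^d
E = norm(loc=11e9, scale=1.5e9).ppf(u)        # per-segment Young's moduli (Pa)

def deflection(moduli):
    # Stand-in for the FEM model: midspan deflection in four-point bending,
    # with the harmonic mean of segment moduli as an effective stiffness.
    E_eff = len(moduli) / np.sum(1.0 / moduli)
    P, L, I = 5e3, 3.0, 2e-4                  # assumed load, span, second moment
    return 23 * P * L**3 / (648 * E_eff * I)

w = np.array([deflection(row) for row in E])
print(f"mean deflection {w.mean() * 1e3:.2f} mm, std {w.std() * 1e3:.2f} mm")
```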
