Search results for: random effects model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10055

6545 Implementation of the Average Input Current Mode Control of Two-Phase Interleaved Boost Converter Using Low-Cost Microcontroller

Authors: Yin Yin Phyo, Tun Lin Naing

Abstract:

In this paper, average input current mode control is proposed for a two-phase interleaved boost converter with two separate input inductors operating in continuous conduction mode (CCM). The required mathematical model is obtained from the equivalent circuits of its four modes of operation. The small-ripple approximation is used to derive the transfer functions from the dynamic model using the switching function. In average input current mode control, the inner current loop and the outer voltage loop are designed with PI controllers using Bode analysis. An anti-windup structure is applied to the PI controllers in the control system. The simulation work is carried out in MATLAB/Simulink, and the hardware prototype is implemented using the low-cost Arduino Nano microcontroller. Finally, a laboratory prototype, built from components available on the local market, is constructed to validate the mathematical model. The results show that the output voltage response achieves fast rise and settling times with acceptable overshoot.
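
The controller internals are not given in the abstract, so the following is only a minimal sketch of a discrete PI loop with back-calculation anti-windup of the kind described; the gains, limits and sample time are illustrative assumptions, not the paper's design values.

```python
# Minimal sketch of a discrete PI controller with back-calculation anti-windup,
# of the kind used for the inner current and outer voltage loops.
# Gains, limits and the example numbers below are illustrative only.

class AntiWindupPI:
    def __init__(self, kp, ki, ts, out_min, out_max, kaw=1.0):
        self.kp, self.ki, self.ts = kp, ki, ts
        self.out_min, self.out_max = out_min, out_max
        self.kaw = kaw          # back-calculation gain
        self.integral = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        u_unsat = self.kp * error + self.integral
        u = min(max(u_unsat, self.out_min), self.out_max)   # saturate the output
        # Back-calculation: bleed the integrator whenever the output saturates.
        self.integral += self.ts * (self.ki * error + self.kaw * (u - u_unsat))
        return u

# Example: outer voltage loop commanding an average input current reference.
voltage_loop = AntiWindupPI(kp=0.5, ki=40.0, ts=1e-4, out_min=0.0, out_max=10.0)
current_ref = voltage_loop.update(setpoint=48.0, measurement=43.5)
print(current_ref)
```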

Keywords: Average input current mode control, interleaved boost converter, low-cost microcontroller, PI controller, switching function.

6544 Random Projections for Dimensionality Reduction in ICA

Authors: Sabrina Gaito, Andrea Greppi, Giuliano Grossi

Abstract:

In this paper we present a technique to speed up ICA based on the idea of reducing the dimensionality of the data set while preserving the quality of the results. In particular, we refer to the FastICA algorithm, which uses kurtosis as the statistical property to be maximized. By performing a Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original one d, which guarantees a narrow confidence interval for such an estimator with high confidence level. The derived dimensionality reduction rate depends on a system control parameter β that is easily computed a priori on the basis of the observations only. Extensive simulations have been carried out on different sets of real-world signals. They show that the dimensionality reduction is in fact very high, that it preserves the quality of the decomposition, and that it impressively speeds up FastICA. On the other hand, a set of signals for which the estimated reduction rate is greater than 1 exhibits poor decomposition results when reduced, thus validating the reliability of the parameter β. We are confident that our method will lead to a better approach to real-time applications.
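
As a rough illustration of the approach (not the authors' code), the sketch below applies a Johnson-Lindenstrauss-style Gaussian projection to synthetic mixtures before running kurtosis-based FastICA; the reduction rate ρ = k/d is fixed by hand here rather than derived from the control parameter β.

```python
# Minimal sketch: project the d observed mixtures down to k dimensions with a
# JL-like random matrix, then run (kurtosis-based) FastICA on the reduced data.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic example: 3 independent sources mixed into d = 100 observations.
n_samples, n_sources, d = 5000, 3, 100
S = rng.laplace(size=(n_samples, n_sources))       # super-Gaussian sources
A = rng.normal(size=(n_sources, d))
X = S @ A                                          # observed mixtures (n_samples x d)

k = 20                                             # reduced dimension, rho = k/d = 0.2
R = rng.normal(size=(d, k)) / np.sqrt(k)           # JL-like Gaussian projection
X_red = X @ R                                      # project the observations

ica = FastICA(n_components=n_sources, fun="cube", random_state=0)  # 'cube' ~ kurtosis
S_est = ica.fit_transform(X_red)
print(S_est.shape)   # (5000, 3): sources recovered from the reduced data
```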

Keywords: Independent Component Analysis, FastICA algorithm, Higher-order statistics, Johnson-Lindenstrauss lemma.

6543 Structural Reliability of Existing Structures: A Case Study

Authors: Z. Sakka, I. Assakkaf, T. Al-Yaqoub, J. Parol

Abstract:

A reliability-based methodology for the assessment and evaluation of reinforced concrete (R/C) structural elements of concrete structures is presented herein. The results of the reliability analysis and assessment for R/C structural elements were verified against results obtained through deterministic methods. The outcomes of the reliability-based analysis were compared with currently adopted safety limits, expressed as reliability indices β, according to international standards and codes. The methodology is based on probabilistic analysis using reliability concepts and the statistics of the main random variables relevant to the subject matter, which enter the performance-function equation(s) associated with the structural elements under study. These techniques yield the reliability index β, also known as the reliability measure, which can be utilized to assess and evaluate the safety, human risk, and functionality of the structural component. These methods can also yield revised partial safety factors for given target reliability indices, which can be used to redesign the R/C elements of the building and can assist in considering other remedial actions to improve the safety and functionality of the member.
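
A minimal sketch of the Monte Carlo side of such an analysis is given below, assuming a generic performance function g = R − Q with illustrative distributions rather than the paper's case-study data.

```python
# Minimal sketch of estimating a reliability index by Monte Carlo simulation
# for a generic performance function g(R, Q) = R - Q (resistance minus load).
# The distributions below are illustrative, not the paper's case-study data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 1_000_000

R = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # resistance
Q = rng.normal(loc=180.0, scale=30.0, size=n)               # load effect

g = R - Q                               # performance function: failure when g < 0
pf = np.mean(g < 0)                     # probability of failure
beta = -norm.ppf(pf)                    # reliability index beta = -Phi^{-1}(pf)
print(f"pf = {pf:.2e}, beta = {beta:.2f}")
```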

Keywords: Concrete Structures, FORM, Monte Carlo Simulation, Structural Reliability.

6542 Constraint Active Contour Model with Application to Automated Three-Dimensional Airway Wall Segmentation

Authors: Kuo-Lung Lor, Chi-Hsuan Tsou, Yeun-Chung Chang, Chung-Ming Chen

Abstract:

For evaluating the severity of Chronic Obstructive Pulmonary Disease (COPD), one is interested in inspecting the airway wall thickening due to inflammation. Although airway segmentation is well developed and can reconstruct the airway tree to high order, airway wall segmentation remains a challenging task. When tackling this problem as a multi-surface segmentation, the interrelation between surfaces needs to be considered. We propose a new method for three-dimensional airway wall segmentation using a spring-structured active contour model. The method incorporates the gravitational field of the image and the repelling force field of the inner lumen as the soft constraint, and the geometric spring structure of the active contour as the hard constraint, to approximate a three-dimensional coupled surface ready for thickness measurements. The results show that the topology constraints of the coupled surfaces are preserved. In conclusion, our springy, soft-tissue-like structure ensures a globally optimal solution and avoids the shortcomings that follow from an improper inner-surface constraint.

Keywords: active contour model, airway wall, COPD, geometric spring structure

6541 Home Network-Specific RBAC Model

Authors: Geon-Woo Kim, Do-Woo Kim, Jun-Ho Lee, Jin-Beon Hwang, Jong-Wook Han

Abstract:

As various mobile sensing technologies, remote control and ubiquitous infrastructure are developing and expectations on quality of life are increasing, much research and development on home network technologies and services is actively under way. Until now, the focus has been on how to provide users with high-level home network services, while relatively little research on home network security for guaranteeing safety has been carried out. In this paper, we therefore propose an access control model specific to the home network that provides various kinds of users with home network services according to each one's characteristics and features, and that protects home network systems from illegal or unnecessary accesses and intrusions.

Keywords: Home network security, RBAC, access control, authentication.

6540 Palmprint based Cancelable Biometric Authentication System

Authors: Ying-Han Pang, Andrew Teoh Beng Jin, David Ngo Chek Ling

Abstract:

A cancelable palmprint authentication system is proposed in this paper, specifically designed to overcome the limitations of contemporary biometric authentication systems. In the proposed system, geometric and pseudo-Zernike moments are employed as feature extractors to transform the palmprint image into a lower-dimensional, compact feature representation. Before moment computation, a wavelet transform is adopted to decompose the palmprint image into lower-resolution and lower-dimensional frequency subbands, which drastically reduces the computational load of the moment calculation. The generated wavelet-moment-based feature representation is used, together with a set of random data, to generate a cancelable verification key. This private binary key can be cancelled and replaced. Besides that, the key also possesses a high tolerance to data-capture offsets, with highly correlated bit strings for the intra-class population. This property allows a clear separation of the genuine and imposter populations, as well as achievement of zero Equal Error Rate, which is hardly attainable in conventional biometric authentication systems.
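
The sketch below illustrates only the generic random-projection-and-threshold ("discrete hashing") step that turns a fixed-length feature vector into a revocable binary key; the wavelet-moment feature extraction and the paper's exact hashing scheme are not reproduced, and all sizes and seeds are assumptions.

```python
# Minimal sketch of generating a cancelable binary key from a fixed-length
# feature vector via a user-specific random projection followed by
# thresholding. `features` is a placeholder for the wavelet-moment features.
import numpy as np

def cancelable_key(features, user_token_seed, key_bits=64):
    rng = np.random.default_rng(user_token_seed)   # issuing a new seed revokes the key
    basis = rng.normal(size=(key_bits, features.size))
    basis, _ = np.linalg.qr(basis.T)               # orthonormalise the random basis
    proj = features @ basis                        # project onto the random directions
    return (proj > 0).astype(np.uint8)             # binarise against a zero threshold

features = np.random.default_rng(2).normal(size=128)   # stand-in feature vector
key_a = cancelable_key(features, user_token_seed=1234)
key_b = cancelable_key(features, user_token_seed=9999)  # "cancelled" and re-issued key
print(key_a[:16], int(np.sum(key_a != key_b)), "bits differ")
```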

Keywords: Cancelable biometric authenticator, Discrete Hashing, Moments, Palmprint.

6539 Power System Security Constrained Economic Dispatch Using Real Coded Quantum Inspired Evolution Algorithm

Authors: A. K. Al-Othman, F. S. Al-Fares, K. M. EL-Nagger

Abstract:

This paper presents a new optimization technique based on quantum computing principles to solve the security constrained power system economic dispatch problem (SCED). The proposed technique is a population-based algorithm, which uses some quantum computing elements in coding and evolving groups of potential solutions to reach the optimum following a partially directed random approach. The SCED problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. A Real Coded Quantum-Inspired Evolution Algorithm (RQIEA) is then applied to solve the constrained optimization formulation. Simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and shows that RQIEA is well suited to solving the security constrained power system economic dispatch problem (SCED).

Keywords: State Estimation, Fuzzy Linear Regression, Fuzzy Linear State Estimator (FLSE), Measurements Uncertainty.

6538 Quantifying Uncertainties in an Archetype-Based Building Stock Energy Model by Use of Individual Building Models

Authors: Morten Brøgger, Kim Wittchen

Abstract:

Focus on reducing energy consumption in existing buildings at large scale, e.g. in cities or countries, has been increasing in recent years. In order to reduce energy consumption in existing buildings, political incentive schemes are put in place and large scale investments are made by utility companies. Prioritising these investments requires a comprehensive overview of the energy consumption in the existing building stock, as well as potential energy-savings. However, a building stock comprises thousands of buildings with different characteristics making it difficult to model energy consumption accurately. Moreover, the complexity of the building stock makes it difficult to convey model results to policymakers and other stakeholders. In order to manage the complexity of the building stock, building archetypes are often employed in building stock energy models (BSEMs). Building archetypes are formed by segmenting the building stock according to specific characteristics. Segmenting the building stock according to building type and building age is common, among other things because this information is often easily available. This segmentation makes it easy to convey results to non-experts. However, using a single archetypical building to represent all buildings in a segment of the building stock is associated with loss of detail. Thermal characteristics are aggregated while other characteristics, which could affect the energy efficiency of a building, are disregarded. Thus, using a simplified representation of the building stock could come at the expense of the accuracy of the model. The present study evaluates the accuracy of a conventional archetype-based BSEM that segments the building stock according to building type- and age. The accuracy is evaluated in terms of the archetypes’ ability to accurately emulate the average energy demands of the corresponding buildings they were meant to represent. This is done for the buildings’ energy demands as a whole as well as for relevant sub-demands. Both are evaluated in relation to the type- and the age of the building. This should provide researchers, who use archetypes in BSEMs, with an indication of the expected accuracy of the conventional archetype model, as well as the accuracy lost in specific parts of the calculation, due to use of the archetype method.

Keywords: Building stock energy modelling, energy-savings, archetype.

6537 Adsorption Kinetics of Alcohols over MCM-41 Materials

Authors: Farouq Twaiq, Mustafa Nasser, Siham Al-Hajri, Mansoor Al-Hasani

Abstract:

Adsorption of methanol and ethanol over a mesoporous siliceous material is studied in the current paper. The pure mesoporous silica is prepared using tetraethylorthosilicate (TEOS) as the silica source and dodecylamine as the template at low pH. The prepared material was characterized using nitrogen adsorption, X-ray diffraction (XRD) and scanning electron microscopy (SEM). The adsorption kinetics of methanol and ethanol from aqueous solution were studied over the prepared mesoporous silica material. The percent removal of alcohol was calculated per unit mass of adsorbent used. The 1st-order model is found to be in agreement with both adsorbates, while the 2nd-order model fits the adsorption of methanol only.
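
A minimal sketch of fitting the pseudo-first-order and pseudo-second-order kinetic models is shown below; the uptake data points are synthetic placeholders, not the paper's measurements.

```python
# Minimal sketch of fitting pseudo-first-order and pseudo-second-order kinetic
# models to adsorption-uptake data. The data points below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0., 5., 10., 20., 30., 60., 90., 120.])        # time, min
q = np.array([0., 2.1, 3.4, 4.8, 5.5, 6.2, 6.4, 6.5])        # uptake, mg/g

def first_order(t, qe, k1):      # q(t) = qe * (1 - exp(-k1 t))
    return qe * (1.0 - np.exp(-k1 * t))

def second_order(t, qe, k2):     # q(t) = qe^2 k2 t / (1 + qe k2 t)
    return (qe**2 * k2 * t) / (1.0 + qe * k2 * t)

p1, _ = curve_fit(first_order, t, q, p0=[6.5, 0.05])
p2, _ = curve_fit(second_order, t, q, p0=[7.0, 0.01])

for name, model, p in [("1st order", first_order, p1), ("2nd order", second_order, p2)]:
    rss = np.sum((q - model(t, *p))**2)
    print(f"{name}: qe={p[0]:.2f}, k={p[1]:.4f}, RSS={rss:.3f}")
```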

Keywords: Adsorption, Kinetics, Mesoporous silica, Methanol

6536 Environmental Decision Making Model for Assessing On-Site Performances of Building Subcontractors

Authors: Buket Metin

Abstract:

Buildings cause a variety of loads on the environment due to activities performed at each stage of the building life cycle. Construction is the first stage that affects both the natural and built environments at different steps of the process, which can be defined as transportation of materials within the construction site, formation and preparation of materials on-site and the application of materials to realize the building subsystems. All of these steps require the use of technology, which varies based on the facilities that contractors and subcontractors have. Hence, environmental consequences of the construction process should be tackled by focusing on construction technology options used in every step of the process. This paper presents an environmental decision-making model for assessing on-site performances of subcontractors based on the construction technology options which they can supply. First, construction technologies, which constitute information, tools and methods, are classified. Then, environmental performance criteria are set forth related to resource consumption, ecosystem quality, and human health issues. Finally, the model is developed based on the relationships between the construction technology components and the environmental performance criteria. The Fuzzy Analytical Hierarchy Process (FAHP) method is used for weighting the environmental performance criteria according to environmental priorities of decision-maker(s), while the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method is used for ranking on-site environmental performances of subcontractors using quantitative data related to the construction technology components. Thus, the model aims to provide an insight to decision-maker(s) about the environmental consequences of the construction process and to provide an opportunity to improve the overall environmental performance of construction sites.
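
The TOPSIS ranking step can be sketched as follows, assuming an illustrative decision matrix and FAHP-style weights rather than the paper's actual criteria and data.

```python
# Minimal sketch of the TOPSIS ranking step for subcontractors scored on
# weighted environmental criteria; the matrix and weights are placeholders.
import numpy as np

# Rows: subcontractors, columns: criteria (e.g. resource use, emissions, waste recovery).
X = np.array([[7.0, 120.0, 3.2],
              [5.5,  90.0, 4.1],
              [8.2, 150.0, 2.5]])
weights = np.array([0.5, 0.3, 0.2])          # would come from the FAHP step
benefit = np.array([False, False, True])     # False = lower is better

V = weights * X / np.linalg.norm(X, axis=0)          # weighted normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)             # higher = better performance

print(np.argsort(-closeness))   # ranking of subcontractors, best first
```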

Keywords: Construction process, construction technology, decision making, environmental performance, subcontractors.

6535 Research on IBR-Driven Distributed Collaborative Visualization System

Authors: Yin Runmin, Song Changfeng

Abstract:

Image-Based Rendering (IBR) techniques have recently reached broad fields, which raises the critical challenge of building an IBR-driven visualization platform that meets the requirements of high performance, large-scale aggregation and concentration of distributed visualization resources, deployment by multiple operators, and CSCW design. This paper presents a unique IBR-based visualization dataflow model that reflects the specific characteristics of IBR techniques, then discusses the prominent features of an IBR-driven distributed collaborative visualization (DCV) system, before finally proposing a novel prototype. The prototype provides three well-defined levels of modules, namely the Central Visualization Server, the Local Proxy Server and the Visualization Aid Environment, through which data and control for collaboration move following the dataflow model above. With the aid of this triple-hierarchy architecture, the construction of IBR-oriented applications becomes easy. The employed augmented collaboration strategy not only achieves convenient synchronous control by multiple users and stable processing management, but is also extendable and scalable.

Keywords: Image-Based Rendering, Distributed Collaborative Visualization, Computer Supported Cooperative Work, Model and Simulation, Modular Visualization Environment.

6534 A New Approach In Protein Folding Studies Revealed The Potential Site For Nucleation Center

Authors: Nurul Bahiyah Ahmad Khairudin, Habibah A Wahab

Abstract:

A new approach to predict the 3D structures of proteins by combining the knowledge-based method and Molecular Dynamics Simulation is presented on the chicken villin headpiece subdomain (HP-36). Comparative modeling is employed as the knowledge-based method to predict the core region (Ala9-Asn28) of the protein while the remaining residues are built as extended regions (Met1-Lys8; Leu29-Phe36) which then further refined using Molecular Dynamics Simulation for 120 ns. Since the core region is built based on a high sequence identity to the template (65%) resulting in RMSD of 1.39 Å from the native, it is believed that this well-developed core region can act as a 'nucleation center' for subsequent rapid downhill folding. Results also demonstrate that the formation of the non-native contact which tends to hamper folding rate can be avoided. The best 3D model that exhibits most of the native characteristics is identified using clustering method which then further ranked based on the conformational free energies. It is found that the backbone RMSD of the best model compared to the NMR-MDavg is 1.01 Å and 3.53 Å, for the core region and the complete protein, respectively. In addition to this, the conformational free energy of the best model is lower by 5.85 kcal/mol as compared to the NMR-MDavg. This structure prediction protocol is shown to be effective in predicting the 3D structure of small globular protein with a considerable accuracy in much shorter time compared to the conventional Molecular Dynamics simulation alone.

Keywords: 3D model, Chicken villin headpiece subdomain, Molecular dynamics simulation, NMR-MDavg, RMSD.

6533 Quantifying the Second-Level Digital Divide on Sub-National Level

Authors: Vladimir Korovkin, Albert Park, Evgeny Kaganer

Abstract:

Digital divide, the gap in access to the world of digital technologies and the socio-economic opportunities that they create, is an important phenomenon of the 21st century. This gap may exist between countries, between regions within a country, or between socio-demographic groups, creating classes of "digital haves and have-nots". While the first-level divide (the difference in opportunities to access digital networks) has been shown to diminish with time, the issues of the second-level divide (the difference in skills and usage of digital systems) and the third-level divide (the difference in the effects obtained from digital technology) may grow. The paper offers a systematic review of the literature on the measurement of the digital divide, noting a certain conceptual stagnation due to the lack of effective instruments that would capture the complex nature of the phenomenon. As a result, many important concepts do not receive the empirical exploration they deserve. As a solution, the paper suggests a composite Digital Life Index that studies digital supply and demand separately across seven independent dimensions, providing 14 subindices. The Index is based on Internet-borne data, a distinction from traditional research approaches that rely on official statistics or surveys. The application of the model to the study of the digital divide between Russian regions and between cities in China has brought promising results. The paper advances the existing methodological literature on the second-level digital divide and can also inform practical decision-making regarding the strategies of national and regional digital development.

Keywords: Digital transformation, second-level digital divide, composite index, digital policy.

6532 A TFETI Domain Decomposition Solver for Von Mises Elastoplasticity Model with Combination of Linear Isotropic-Kinematic Hardening

Authors: Martin Cermak, Stanislav Sysala

Abstract:

In this paper we present an efficient parallel implementation of elastoplastic problems based on the TFETI (Total Finite Element Tearing and Interconnecting) domain decomposition method. This approach allows us to use a parallel solution, to compute this nonlinear problem on supercomputers, to decrease the solution time, and to handle problems with millions of DOFs. In our approach we consider an associated elastoplastic model with the von Mises plastic criterion and a combination of linear isotropic-kinematic hardening laws. This model is discretized by the implicit Euler method in time and by the finite element method in space. We consider the resulting system of nonlinear equations with a strongly semismooth and strongly monotone operator. The semismooth Newton method is applied to solve this nonlinear system, and the corresponding linearized problems arising in the Newton iterations are solved in parallel by the above-mentioned TFETI. The implementation is realized in our in-house MatSol package developed in MATLAB.

Keywords: Isotropic-kinematic hardening, TFETI, domain decomposition, parallel solution.

6531 Vibration Attenuation Using Functionally Graded Material

Authors: Saeed Asiri, Hassan Hedia, Wael Eissa

Abstract:

The aim of this work was to attenuate the vibration amplitude in a Cessna 172 airplane wing by using functionally graded material (FGM) instead of a uniform or composite material. Wing strength was verified by means of a stress analysis study, while wing vibration amplitudes and mode shapes were obtained by means of modal and harmonic analysis. The methodology was first verified on a simple cantilever plate model, where the results were promising, and the same methodology was then applied to the airplane wing model. Aluminum models, titanium models, and functionally graded aluminum-titanium models were compared, showing a large vibration attenuation when the FGM was used. Optimization of the FGM gradation satisfied our objective of reducing and attenuating the vibration amplitudes and demonstrated the effect of using FGM on the vibration behavior. Testing the aluminum-rich models and comparing them with the titanium-rich model formed the optimization study in this paper. The results show a significant attenuation in vibration magnitudes when using FGM instead of a titanium plate, and when using an aluminum wing with FGM spars instead of an aluminum wing. It is also recommended that, in future work, the model scale be changed to 1:10 or even 1:1 when computer capabilities allow.

Keywords: Vibration, Attenuation, FGM, ANSYS2011, FEM.

6530 Lattice Boltzmann Simulation of the Carbonization of Wood Particle

Authors: Ahmed Mahmoudi, Imen Mejri, Mohamed A. Abbassi, Ahmed Omri

Abstract:

A numerical study based on the Lattice Boltzmann Method (LBM) is proposed to solve one, two and three dimensional heat and mass transfer for isothermal carbonization of thick wood particles. To check the validity of the proposed model, computational results have been compared with the published data and a good agreement is obtained. Then, the model is used to study the effect of reactor temperature and thermal boundary conditions, on the evolution of the local temperature and the mass distributions of the wood particle during carbonization
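
As a rough illustration of the numerical scheme (not the authors' model), the sketch below runs a one-dimensional two-velocity BGK lattice Boltzmann solver for heat conduction in lattice units; the pyrolysis chemistry and mass transfer of the paper are omitted and all parameters are assumptions.

```python
# Minimal sketch of a 1D lattice Boltzmann (two-velocity, BGK) solver for the
# heat-conduction part of such a problem, in lattice units.
import numpy as np

nx, nsteps = 200, 2000
tau = 0.8                      # relaxation time; diffusivity grows with (tau - 0.5)
T_hot, T_init = 1.0, 0.0       # reactor-side and initial temperatures (dimensionless)

T = np.full(nx, T_init)
f_plus = 0.5 * T.copy()        # population moving in +x
f_minus = 0.5 * T.copy()       # population moving in -x

for _ in range(nsteps):
    T = f_plus + f_minus                       # macroscopic temperature
    feq = 0.5 * T
    f_plus += (feq - f_plus) / tau             # BGK collision
    f_minus += (feq - f_minus) / tau
    f_plus = np.roll(f_plus, 1)                # streaming
    f_minus = np.roll(f_minus, -1)
    # Dirichlet boundaries: hot reactor wall at x=0, initial temperature at x=L.
    f_plus[0] = T_hot - f_minus[0]
    f_minus[-1] = T_init - f_plus[-1]

T = f_plus + f_minus
print(T[::40])   # temperature profile penetrating into the particle
```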

Keywords: Lattice Boltzmann Method, pyrolysis conduction, carbonization, Heat and mass transfer.

6529 Aspect-Level Sentiment Analysis with Multi-Channel and Graph Convolutional Networks

Authors: Jiajun Wang, Xiaoge Li

Abstract:

The purpose of the aspect-level sentiment analysis task is to identify the sentiment polarity of aspects in a sentence. Currently, most methods mainly focus on using neural networks and attention mechanisms to model the relationship between aspects and context, but they ignore the dependence between words at different ranges in the sentence, resulting in deviations when assigning relationship weights to words other than the aspect words. To solve these problems, we propose an aspect-level sentiment analysis model that combines a multi-channel convolutional network and a graph convolutional network (GCN). Firstly, the context and the degree of association between words are characterized by Long Short-Term Memory (LSTM) and a self-attention mechanism. In addition, a multi-channel convolutional network is used to extract the features of words at different ranges. Finally, a graph convolutional network is used to aggregate the node information of the dependency tree structure. We conduct experiments on four benchmark datasets. The experimental results are compared with those of other models, showing that our model is better and more effective.
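
A minimal sketch of the graph-convolution building block over a dependency-tree adjacency matrix is given below; the shapes, weights and toy dependency arcs are assumptions, and the LSTM/attention and multi-channel CNN encoders are not reproduced.

```python
# Minimal sketch of one graph-convolution layer over a dependency-tree
# adjacency matrix, the building block combined with the sequence encoders.
import numpy as np

rng = np.random.default_rng(0)

n_words, d_in, d_out = 6, 8, 8
H = rng.normal(size=(n_words, d_in))       # word representations from the encoder
A = np.zeros((n_words, n_words))
edges = [(0, 1), (1, 2), (2, 3), (2, 4), (4, 5)]   # toy undirected dependency arcs
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
A += np.eye(n_words)                       # add self-loops

def gcn_layer(H, A, W):
    deg = A.sum(axis=1)
    A_norm = A / deg[:, None]              # row-normalised adjacency (D^-1 A)
    return np.maximum(A_norm @ H @ W, 0.0) # ReLU(D^-1 A H W)

W = rng.normal(size=(d_in, d_out)) * 0.1
H_out = gcn_layer(H, A, W)
print(H_out.shape)   # (6, 8): each node now mixes information from its neighbours
```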

Keywords: Aspect-level sentiment analysis, attention, multi-channel convolution network, graph convolution network, dependency tree.

6528 Dynamic Modeling and Simulation of Industrial Naphtha Reforming Reactor

Authors: Gholamreza Zahedi, M. Tarin, M. Biglari

Abstract:

This work investigated the steady state and dynamic simulation of a fixed bed industrial naphtha reforming reactor. The performance of the reactor was investigated using a heterogeneous model. For the process simulation, the differential equations are solved using the 4th-order Runge-Kutta method. The models were validated against measured process data from an existing naphtha reforming plant. The simulation results, in terms of component yields and outlet temperature, were in good agreement with the empirical data. The simple model provides a useful tool for dynamic simulation, optimization and control of naphtha reforming.
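
A minimal sketch of a classical 4th-order Runge-Kutta step is given below; the right-hand side is a generic placeholder system, not the heterogeneous reforming kinetics.

```python
# Minimal sketch of the classical 4th-order Runge-Kutta step used to integrate
# a dynamic model; the ODE system here is a generic placeholder.
import numpy as np

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Placeholder system dy/dt = f(t, y), e.g. two coupled component yields.
def f(t, y):
    return np.array([-0.5 * y[0], 0.5 * y[0] - 0.1 * y[1]])

y = np.array([1.0, 0.0])
t, h = 0.0, 0.01
for _ in range(1000):            # integrate to t = 10
    y = rk4_step(f, t, y, h)
    t += h
print(t, y)
```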

Keywords: Dynamic simulation, fixed bed reactor, modeling, reforming

6527 Design and Analysis of MEMS based Accelerometer for Automatic Detection of Railway Wheel Flat

Authors: Rajib Ul Alam Uzzal, Ion Stiharu, Waiz Ahmed

Abstract:

This paper presents the modeling of a MEMS based accelerometer in order to detect the presence of a wheel flat in the railway vehicle. A haversine wheel flat is assigned to one wheel of a 5 DOF pitch plane vehicle model, which is coupled to a 3 layer track model. Based on the simulated acceleration response obtained from the vehicle-track model, an accelerometer is designed that meets all the requirements to detect the presence of a wheel flat. The proposed accelerometer can survive in a dynamic shocking environment with acceleration up to ±150g. The parameters of the accelerometer are calculated in order to achieve the required specifications using lumped element approximation and the results are used for initial design layout. A finite element analysis code (COMSOL) is used to perform simulations of the accelerometer under various operating conditions and to determine the optimum configuration. The simulated results are found within about 2% of the calculated values, which indicates the validity of lumped element approach. The stability of the accelerometer is also determined in the desired range of operation including the condition under shock.
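
The lumped-element sizing logic can be sketched as below, with a purely illustrative proof mass and stiffness rather than the paper's design values; only the ±150 g survival range is taken from the abstract.

```python
# Minimal sketch of the lumped-element (mass-spring-damper) estimates used to
# size such an accelerometer; the numbers below are illustrative.
import numpy as np

m = 2.0e-9          # proof mass, kg (assumed)
k = 50.0            # effective spring stiffness, N/m (assumed)

omega_n = np.sqrt(k / m)                    # natural frequency, rad/s
f_n = omega_n / (2 * np.pi)
sensitivity = 1.0 / omega_n**2              # static displacement per unit acceleration

g = 9.81
a_max = 150 * g                             # required shock survival range, +/-150 g
x_max = a_max / omega_n**2                  # peak proof-mass deflection at a_max

print(f"f_n = {f_n/1e3:.1f} kHz, sensitivity = {sensitivity*1e9:.3f} nm per m/s^2, "
      f"deflection at 150 g = {x_max*1e6:.3f} um")
```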

Keywords: MEMS accelerometer, Pitch plane vehicle, wheel flat.

6526 Effects of Gamification on Lower Secondary School Students’ Motivation and Engagement

Authors: Goh Yung Hong, Mona Masood

Abstract:

This paper explores the effects of gamification on lower secondary school students’ motivation and engagement in the classroom. A two-group posttest-only experimental design was employed to study the influence of a gamification teaching method (GTM) compared with a conventional teaching method (CTM) on 60 lower secondary school students. The Student Engagement Instrument (SEI) and the Intrinsic Motivation Inventory (IMI) were used to assess students’ intrinsic motivation and engagement level towards the respective teaching method. The findings indicate that students who completed the GTM lesson were significantly higher in intrinsic motivation to learn than those taught with the CTM. Although the result was not significant and there was only a marginal difference in the engagement means, GTM still shows better potential for raising students’ engagement in class when compared with CTM. This finding suggests that the GTM may help address the current issue of low motivation to learn and low engagement in class among lower secondary school students in Malaysia. On the other hand, despite being non-significant, higher means indicate that CTM contributes positively to higher peer support for learning and a better teacher-student relationship when compared with GTM. In conclusion, the gamification approach is flexible and can be adapted to many kinds of learning content to enhance intrinsic motivation to learn and, to some extent, encourage better student engagement in class.

Keywords: Conventional teaching method, Gamification teaching method, Motivation, Engagement.

6525 Numerical Simulation of Natural Gas Dispersion from Low Pressure Pipelines

Authors: Omid Adibi, Nategheh Najafpour, Bijan Farhanieh, Hossein Afshin

Abstract:

Gas release from pipelines is one of the main causes of accidents in the gas industry. The released gas ejects from the pipeline as a free jet and, as the jet grows, the fuel mixes with the ambient air. Accordingly, an accidental spark will release the chemical energy of the mixture as an explosion. A gas explosion damages equipment and endangers the lives of staff. Because of the importance of safety in the gas industry, anticipating such accidents can reduce the number of casualties. In this paper, natural gas leakages from low pressure pipelines are studied in two steps: 1) simulation of the mixing process and identification of the flammable zones, and 2) simulation of the wind effects on the mixing process. The numerical simulations were performed using the finite volume method and a pressure-based algorithm, and a structured method was used for grid generation. The results show that, just 6.4 s after the accident, the released natural gas can penetrate 40 m in the vertical and 20 m in the horizontal direction. Moreover, the results show that the wind speed is a key factor in the dispersion process; in fact, the wind transports the flammable zones downstream. Hence, to improve the safety of people and property, it is preferable to construct gas facilities and buildings on the side opposite to the prevailing wind direction.

Keywords: Flammable zones, gas pipelines, numerical simulation, wind effects.

6524 Utilizing Dutch Auction in an Agent-based Model E-commerce System

Authors: Costin Badica, Maria Ganzha, Maciej Gawinecki, Pawel Kobzdej, Marcin Paprzycki

Abstract:

Recently, we presented an initial implementation of a model agent-based e-commerce system, which utilized a simple price negotiation mechanism, the English Auction. In this note we discuss how a Dutch Auction involving multiple units of a product can be included in our system. We present UML diagrams of the agents involved in price negotiations and briefly discuss the rule-based mechanism exemplifying the Dutch Auction.
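
A minimal sketch of the multi-unit Dutch auction price clock that such agents negotiate over is given below; the agent names, valuations and price steps are illustrative assumptions, not the system's rule base.

```python
# Minimal sketch of multi-unit Dutch auction logic: the clock price descends
# until buyer agents claim units or the reserve price is reached.
def dutch_auction(units, start_price, decrement, reserve, buyers):
    """buyers: dict name -> (limit_price, quantity_wanted)."""
    remaining = {name: qty for name, (limit, qty) in buyers.items()}
    price, allocations = start_price, []
    while units > 0 and price >= reserve:
        for name, (limit, _) in buyers.items():
            if price <= limit and remaining[name] > 0 and units > 0:
                qty = min(remaining[name], units)     # accept at the current clock price
                allocations.append((name, qty, price))
                remaining[name] -= qty
                units -= qty
        price -= decrement                            # clock ticks down
    return allocations

result = dutch_auction(units=10, start_price=100.0, decrement=5.0, reserve=60.0,
                       buyers={"agent_A": (85.0, 4), "agent_B": (70.0, 8)})
print(result)   # e.g. [('agent_A', 4, 85.0), ('agent_B', 6, 70.0)]
```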

Keywords: e-commerce, rule-based price negotiation mechanism, Dutch Auction, agent system.

6523 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error

Authors: Oscar Javier Herrera, Manuel Ángel Camacho

Abstract:

This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, applicable when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology began with a characterization and diagnosis of the demand planning process as part of production management; new ways to predict demand through probability techniques, and to calculate the associated error using numerical methods, were then investigated, all based on the behavior of the data. The analysis was carried out considering the specific business circumstances of a company in the communications sector, located in the city of Bogota, Colombia. In conclusion, using this application it was possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase the client satisfaction rate, reduce stock that had not been in rotation for a long time, code its inventory, and plan reorder points for the replenishment of stock.
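
The two ingredients can be sketched as follows, assuming a placeholder demand history and lead time: a forecast taken from the empirical distribution of observations, and an uncertainty attached to a derived quantity by propagation of error.

```python
# Minimal sketch: (i) a forecast drawn from the empirical (frequency)
# distribution of historical demand and (ii) an error for a derived quantity
# obtained by propagation of error. All numbers below are placeholders.
import numpy as np

demand = np.array([120, 135, 128, 150, 142, 130, 125, 160, 138, 145])   # units/week

# (i) Empirical distribution: forecast as the expected value of the
# observations, with standard error sigma / sqrt(n).
forecast = demand.mean()
sigma_forecast = demand.std(ddof=1) / np.sqrt(demand.size)

# (ii) Propagation of error for reorder quantity Q = D * L (demand * lead time):
# sigma_Q^2 = (dQ/dD)^2 sigma_D^2 + (dQ/dL)^2 sigma_L^2.
lead_time, sigma_lead = 2.0, 0.3                       # weeks
Q = forecast * lead_time
sigma_Q = np.sqrt((lead_time * sigma_forecast) ** 2 + (forecast * sigma_lead) ** 2)

print(f"forecast = {forecast:.1f} +/- {sigma_forecast:.1f} units/week, "
      f"reorder quantity = {Q:.0f} +/- {sigma_Q:.0f} units")
```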

Keywords: Demand Forecasting, Empirical Distribution, Propagation of Error.

6522 Dynamical Analysis of a Harvesting Model of Phytoplankton-Zooplankton Interaction

Authors: Anuj K. Sharma, Amit Sharma, Kulbhushan Agnihotri

Abstract:

In this work, we propose and analyze a model of phytoplankton-zooplankton interaction with harvesting, considering that some species are exploited commercially for food. Criteria for local stability, instability and global stability are derived, and some threshold harvesting levels are explored to maintain the population at an appropriate equilibrium level even if the species are exploited continuously. Further, the biological and bionomic equilibria of the system are obtained, and an optimal harvesting policy is analysed using Pontryagin’s Maximum Principle. Finally, the analytical findings are supported by numerical simulations.
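
A minimal numerical sketch of a phytoplankton-zooplankton model with constant-effort harvesting is given below; the functional forms and parameter values are illustrative assumptions, not those analysed in the paper.

```python
# Minimal sketch of a phytoplankton-zooplankton model with constant-effort
# harvesting of both species, integrated numerically. Parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

r, K = 1.0, 10.0            # phytoplankton growth rate and carrying capacity
a, b = 0.8, 0.5             # Holling type-II grazing parameters
c, d = 0.5, 0.3             # conversion efficiency and zooplankton mortality
q1, q2, E = 0.1, 0.05, 1.0  # catchability coefficients and harvesting effort

def rhs(t, y):
    P, Z = y
    grazing = a * P * Z / (1.0 + b * P)
    dP = r * P * (1.0 - P / K) - grazing - q1 * E * P
    dZ = c * grazing - d * Z - q2 * E * Z
    return [dP, dZ]

sol = solve_ivp(rhs, (0.0, 200.0), [2.0, 1.0], dense_output=True)
P_end, Z_end = sol.y[:, -1]
print(f"long-run densities under harvesting: P = {P_end:.2f}, Z = {Z_end:.2f}")
```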

Keywords: Phytoplankton-Zooplankton, Global stability, Bionomic Equilibrium, Pontryagin’s Maximum Principle.

6521 Technological Deep Assessment of Automotive Parts Manufacturers: Case of Iranian Manufacturers

Authors: Manouchehre Ansari, Mahmoud Dehghan Nayeri, Reza Yousefi Zenouz

Abstract:

In order to develop any strategy, it is essential to first identify opportunities, threats, and weak and strong points. Assessment of the technology level makes it possible to concentrate on these weak and strong points. The results of technology assessment have a direct effect on decision making in the field of technology transfer or the expansion of internal research capabilities, so it plays a critical role in technology management. This paper presents a conceptual model to analyze the technology capability of a company as a whole and in four main aspects of technology. The model was tested on 10 automotive parts manufacturers in Iran. Using this model, the capability level of the manufacturers was investigated in four fields: managing aspects, hard aspects, human aspects, and information and knowledge aspects. Results show that these firms concentrate on the hard aspect of technology, while the other aspects are weak and need more support. This industry should therefore develop the other aspects of technology as well as the hard aspect in order to make effective and efficient use of its technology. These findings are useful for technology planning and management in automotive parts manufacturers in Iran and in other industries that are technology followers and transfer the technologies they need.

Keywords: Technology, Technological evaluation, Technology Maturity

6520 A Study about the Distribution of the Spanning Ratios of Yao Graphs

Authors: Maryam Hsaini, Mostafa Nouri-Baygi

Abstract:

A critical problem in wireless sensor networks is the limited battery and memory of the nodes. Therefore, each node in the network can maintain only a subset of its neighbors to communicate with. This increases the battery usage in the network, because each packet may take more hops to reach its destination. In order to tackle these problems, spanner graphs are defined. Since each node has a small degree in a spanner graph and the distance in the graph is not much greater than the actual geographical distance, spanner graphs are suitable candidates for the topology of a wireless sensor network. In this paper, we study Yao graphs and their behavior for randomly selected sets of points. We generate several random point sets and compare the properties of their Yao graphs with the complete graph. Based on our data sets, we obtain several charts demonstrating how Yao graphs behave for randomly chosen point sets. As the results show, the stretch factor of a Yao graph follows a normal distribution. Furthermore, the stretch factor is on average far less than the worst-case stretch factor proved for Yao graphs in previous results. Finally, we use a Yao graph for a realistic point set and study its stretch factor in the real world.
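
A minimal sketch of constructing a Yao graph on a random point set and measuring its stretch factor is given below; the Yao edges are treated as undirected here, and the cone count and point set are arbitrary choices, not the paper's experimental setup.

```python
# Minimal sketch of building a Yao graph on a random point set and measuring
# its stretch factor (max ratio of graph distance to Euclidean distance).
import numpy as np
from scipy.sparse.csgraph import shortest_path

rng = np.random.default_rng(3)
n, k = 60, 6                              # number of points and cones per node
pts = rng.random((n, 2))

W = np.full((n, n), np.inf)
np.fill_diagonal(W, 0.0)
for i in range(n):
    vec = pts - pts[i]
    dist = np.hypot(vec[:, 0], vec[:, 1])
    angle = np.mod(np.arctan2(vec[:, 1], vec[:, 0]), 2 * np.pi)
    cone = (angle // (2 * np.pi / k)).astype(int)
    for c in range(k):
        cand = np.where((cone == c) & (dist > 0))[0]
        if cand.size:                      # keep only the nearest neighbour in each cone
            j = cand[np.argmin(dist[cand])]
            W[i, j] = W[j, i] = dist[j]    # treat the Yao edge as undirected

G = shortest_path(W, method="D")           # all-pairs graph distances (Dijkstra)
euclid = np.hypot(*(pts[:, None, :] - pts[None, :, :]).transpose(2, 0, 1))
mask = euclid > 0
stretch = np.max(G[mask] / euclid[mask])
print(f"stretch factor of the Yao graph: {stretch:.3f}")
```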

Keywords: Wireless sensor network, spanner graph, Yao Graph.

6519 An Empirical Investigation of Montesquieu’s Theories on Climate

Authors: Lisa J. Piergallini

Abstract:

This project uses panel regression analyses to investigate the relationships between geography, institutions, and economic development, as guided by the theories of the 18th century French philosopher Montesquieu. Contemporary scholars of political economy perpetually misinterpret Montesquieu’s theories on climate, and in doing so they miss what could be the key to resolving the geography vs. institutions debate. There is a conspicuous gap in this literature, in that it does not consider whether geography and institutions might have an interactive, dynamic effect on economic development. This project seeks to bridge that gap. Data are used for all available countries over the years 1980-2013. Two interaction terms between geographic and institutional variables are employed within the empirical analyses, and these offer a unique contribution to the ongoing geography vs. institutions debate within the political economy literature. This study finds that there is indeed an interactive effect between geography and institutions, and that this interaction has a statistically significant effect on economic development. Democracy (as measured by Polity score) and rule of law and property rights (as measured by the Fraser index) have positive effects on economic development (as measured by GDP per capita), yet the magnitudes of these effects are stronger in contexts where a low percentage of the national population lives in the geographical tropics. This has implications for promoting economic development, and it highlights the importance of understanding geographical context.
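
As a rough illustration of this kind of specification (not the project's actual data or model), the sketch below regresses a synthetic outcome on an institutions measure, a geography measure and their interaction, with year dummies and country-clustered standard errors.

```python
# Minimal sketch of a panel regression with a geography-institutions
# interaction term; the data frame is entirely synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
rows = []
for c in range(40):                                   # synthetic countries
    tropics = rng.uniform(0, 100)                     # % of population in the tropics
    for year in range(1980, 2014):
        polity = rng.uniform(-10, 10)                 # democracy score
        log_gdp = 8 + 0.05 * polity - 0.01 * tropics \
                  - 0.002 * polity * tropics + rng.normal(scale=0.5)
        rows.append((f"c{c}", year, log_gdp, polity, tropics))

df = pd.DataFrame(rows, columns=["country", "year", "log_gdp_pc", "polity", "tropics"])

model = smf.ols("log_gdp_pc ~ polity * tropics + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": pd.factorize(df["country"])[0]})
print(model.params[["polity", "tropics", "polity:tropics"]])
```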

Keywords: Montesquieu, geography, institutions, economic development, political philosophy, political economy.

6518 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and Validation of the simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on Verification and Validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady state and transient conditions. The process of Verification and Validation helps in qualifying the process simulator for its intended purpose, whether that is providing comprehensive training or design verification. In general, model verification is carried out by comparing the simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters with the actual plant process parameters, either in standalone mode or in integrated mode. A Full Scope Replica Operator Training Simulator for the PFBR (Prototype Fast Breeder Reactor), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, in which the main participants are engineers/experts belonging to the modeling team and the process design and instrumentation & control design teams. This paper discusses the Verification and Validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, the reference documents and standards used, etc. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts’ comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), Steady State, Transient State.

6517 Knowledge Acquisition and Client Organisations: Case Study of a Student as Producer

Authors: Barry Ardley, Abi Hunt, Nick Taylor

Abstract:

As a theoretical and practical framework this study uses the student as producer approach to learning in higher education, as adopted by the Lincoln International Business School, University of Lincoln, UK. Student as producer positions learners as skilled and capable agents, able to participate as partners with tutors in live research projects. To illuminate the nature of this approach to learning and to highlight its critical issues, the authors report on two guided student consultancy projects. These were set up with the assistance of two local organisations in the city of Lincoln UK. Using the student as producer model to deliver the projects enabled learners to acquire and develop a range of key skills and knowledge, not easily accessible in more traditional educational settings. This paper presents a systematic case study analysis of the eight organising principles of the student as producer model, as adopted by university tutors. The experience of tutors implementing student as producer suggests that the model can be widely applied to benefit not only the learning and teaching experiences of higher education students, and staff, but additionally, a university’s research programme and its community partners.

Keywords: Experiential learning, consultancy clients, student as producer.

6516 Gray Level Image Encryption

Authors: Roza Afarin, Saeed Mozaffari

Abstract:

The aim of this paper is image encryption using a Genetic Algorithm (GA). The proposed encryption method consists of two phases. In the modification phase, pixel locations are altered to reduce the correlation among adjacent pixels. Then, pixel values are changed in the diffusion phase to encrypt the input image. Both phases are performed by a GA with binary chromosomes. For the modification phase, these binary patterns are generated by the Local Binary Pattern (LBP) operator, while for the diffusion phase the binary chromosomes are obtained by Bit Plane Slicing (BPS). The initial population in the GA includes the rows and columns of the input image. Instead of a subjective selection of parents from this initial population, a random generator with a predefined key is utilized; this key is necessary to decrypt the coded image and reconstruct the initial input image. The fitness function is defined as the average of transitions from 0 to 1 in the LBP image and the histogram uniformity, in the modification and diffusion phases respectively. The randomness of the encrypted image is measured by entropy, correlation coefficients and histogram analysis. Experimental results show that the proposed method is fast enough and can be used effectively for image encryption.
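
A minimal sketch of the two binary-pattern sources mentioned, bit-plane slicing and the LBP operator, is given below on a toy 4x4 image; the GA itself is not reproduced.

```python
# Minimal sketch of bit-plane slicing and the 8-neighbour LBP code of each
# pixel, the two binary-pattern sources the chromosomes are drawn from.
# The 4x4 image is a toy placeholder.
import numpy as np

img = np.array([[ 52,  55,  61,  59],
                [ 79,  61,  76,  41],
                [ 86,  71,  67,  38],
                [ 49,  62,  68,  44]], dtype=np.uint8)

# Bit-plane slicing: plane b holds bit b of every pixel (b = 7 is most significant).
bit_planes = [(img >> b) & 1 for b in range(8)]
print("MSB plane:\n", bit_planes[7])

def lbp_image(im):
    """8-neighbour LBP: threshold each neighbour against the centre pixel."""
    padded = np.pad(im.astype(int), 1, mode="edge")
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    out = np.zeros_like(im, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = padded[1 + dy: 1 + dy + im.shape[0], 1 + dx: 1 + dx + im.shape[1]]
        out |= ((neigh >= im).astype(np.uint8) << bit)
    return out

print("LBP codes:\n", lbp_image(img))
```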

Keywords: Correlation coefficients, Genetic algorithm, Image encryption, Image entropy.
