Search results for: Component Model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8115


7785 Information Retrieval: Improving Question Answering Systems by Query Reformulation and Answer Validation

Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour

Abstract:

Question answering (QA) aims at retrieving precise information from a large collection of documents. Most question answering systems are composed of three main modules: question processing, document processing, and answer processing. The question processing module plays an important role in QA systems by reformulating questions. Moreover, the answer processing module is an emerging topic in QA systems, where systems are often required to rank and validate candidate answers. These techniques, which aim at finding short and precise answers, are often based on semantic relations and keyword co-occurrence. This paper discusses a new model for question answering that improves the two main modules, question processing and answer processing, both of which affect the evaluation of the system's operation. Question processing rests on two important components. The first is question classification, which specifies the types of question and answer. The second is reformulation, which converts the user's question into a form the QA system can understand in a specific domain. The objective of the answer validation task is thus to judge the correctness of an answer returned by a QA system, according to the text snippet given to support it. To validate answers, we apply candidate answer filtering and candidate answer ranking, followed by a final validation stage based on user voting. The paper also describes the new architecture of the question and answer processing modules, along with the modeling, implementation, and evaluation of the system. The system differs from most question answering systems in its answer validation model, which makes it better suited to finding exact answers. Evaluation of the model on a total of 50 questions shows a 92% improvement in the system's decisions.

Keywords: Answer processing, answer validation, classification, question answering, query reformulation.

7784 Implementation of TinyHash based on Hash Algorithm for Sensor Network

Authors: HangRok Lee, YongJe Choi, HoWon Kim

Abstract:

In recent years, security architectures for sensor networks have been proposed [2][4]. One of these, TinySec by Chris Karlof, Naveen Sastry, and David Wagner, proposed a link-layer security architecture that takes the constraints of sensor networks (i.e., energy, bandwidth, computation capability, etc.) into account. TinySec employs CBC-mode encryption and CBC-MAC authentication based on the Skipjack block cipher, and it is currently incorporated in TinyOS for sensor network security. This paper introduces TinyHash, which is based on a general hash algorithm. TinyHash is a module intended to replace the authentication and integrity parts of TinySec; that is, it applies a hash algorithm within the TinySec architecture. For compatibility with TinySec, the components of TinyHash are constructed with a structure similar to that of TinySec. TinyHash implements an HMAC component for authentication and a Digest component for message integrity. Additionally, we define several interfaces for services associated with the hash algorithm.
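The abstract does not include the nesC code of the HMAC component; as a general illustration of the standard HMAC construction that such a component is built around (not the TinyOS implementation; the key, message, and block size below are illustrative), a minimal Python sketch might look like this:

```python
import hashlib

BLOCK_SIZE = 64  # block size in bytes for SHA-1/SHA-256

def hmac_digest(key: bytes, message: bytes, hashfunc=hashlib.sha1) -> bytes:
    """Standard HMAC construction: H((K ^ opad) || H((K ^ ipad) || m))."""
    if len(key) > BLOCK_SIZE:              # long keys are hashed first
        key = hashfunc(key).digest()
    key = key.ljust(BLOCK_SIZE, b"\x00")   # pad key up to the block size
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5C for b in key)
    inner = hashfunc(ipad + message).digest()
    return hashfunc(opad + inner).digest()

# Example with a hypothetical shared link key and sensor message
tag = hmac_digest(b"shared-link-key", b"sensor reading: 21.5C")
print(tag.hex())
```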

Keywords: sensor network security, nesC, TinySec, TinyOS, Hash, HMAC, integrity

7783 Lean Impact Analysis Assessment Models: Development of a Lean Measurement Structural Model

Authors: Catherine Maware, Olufemi Adetunji

Abstract:

This paper is aimed at developing a model to measure the impact of Lean manufacturing deployment on organizational performance. The model will help industry practitioners assess the impact of implementing Lean constructs on organizational performance. It will also harmonize the measurement models of Lean performance with the house of Lean, which seems to have become the industry standard. The sheer number of measurement models for assessing the impact of Lean implementation makes it difficult for new adopters to select an appropriate assessment model or deployment methodology. A literature review is conducted to classify the Lean performance models. Pareto analysis is used to select the Lean constructs for the development of the model. The model is further formalized through the use of Structural Equation Modeling (SEM) to define the underlying latent structure of a Lean system. The impact assessment measurement model developed can be used to measure Lean performance and can be adopted by different industries.

Keywords: Impact measurement model, lean bundles, lean manufacturing, organizational performance.

7782 An Implicit Region-Based Deformable Model with Local Segmentation Applied to Weld Defects Extraction

Authors: Y. Boutiche, N. Ramou, M. Ben Gharsallah

Abstract:

This paper presents and discusses a model that allows local segmentation using the statistical information of a given image. It is based on the Chan-Vese model, curve evolution, partial differential equations, and the binary level set method. The proposed model uses the piecewise constant approximation of the Chan-Vese model to compute a Signed Pressure Force (SPF) function, which attracts the curve to the true object boundaries. The implemented model is used to extract weld defects from radiographic weld images in order to calculate the perimeter and surface area of those defects; encouraging results are obtained on synthetic and real radiographic images.
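The abstract does not state the SPF formula; in SPF-based active contour formulations built on the Chan-Vese piecewise constant means c1 (inside) and c2 (outside), it is commonly defined as SPF(x) = (I(x) - (c1 + c2)/2) / max|I(x) - (c1 + c2)/2|. A minimal NumPy sketch of that step, assuming a binary level set phi in which phi > 0 marks the interior, could be:

```python
import numpy as np

def spf(image: np.ndarray, phi: np.ndarray) -> np.ndarray:
    """Signed Pressure Force from the Chan-Vese piecewise constant means.

    image : grayscale image as a float array
    phi   : binary level set; phi > 0 marks the current interior region
    """
    inside = phi > 0
    c1 = image[inside].mean()      # mean intensity inside the contour
    c2 = image[~inside].mean()     # mean intensity outside the contour
    diff = image - (c1 + c2) / 2.0
    return diff / (np.abs(diff).max() + 1e-12)   # normalised to [-1, 1]
```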

Keywords: Active contour, Chan-Vese Model, local segmentation, weld radiographic images.

7781 Model of Optimal Centroids Approach for Multivariate Data Classification

Authors: Pham Van Nha, Le Cam Binh

Abstract:

Particle swarm optimization (PSO) is a population-based stochastic optimization algorithm. PSO was inspired by the natural behavior of birds and fish during migration and foraging for food. PSO is considered a multidisciplinary optimization model that can be applied to various optimization problems. PSO's ideas are simple and easy to understand, but PSO tends to be applied only to simple model problems. We argue that, in order to expand the applicability of PSO to complex problems, PSO should be described more explicitly in the form of a mathematical model. In this paper, we represent PSO as a mathematical model and apply it to multivariate data classification. First, PSO's general mathematical model (MPSO) is analyzed as a universal optimization model. Then, the Model of Optimal Centroids (MOC) is proposed for multivariate data classification. Experiments were conducted on several benchmark data sets to demonstrate the effectiveness of MOC compared with several previously proposed schemes.
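The paper's MPSO formulation is not reproduced in the abstract; the canonical PSO updates it generalizes are v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x) and x <- x + v. A minimal Python sketch of this baseline (the parameter values and test function are illustrative, not the authors') might be:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-5, 5)):
    """Plain global-best PSO for minimising f over a box-bounded search space."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))            # particle positions
    v = np.zeros_like(x)                                    # particle velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Example: minimise the sphere function in 5 dimensions
best_x, best_val = pso_minimize(lambda p: np.sum(p**2), dim=5)
```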

Keywords: Analysis of optimization, artificial intelligence-based optimization, optimization for learning and data analysis, global optimization.

7780 Formal Verification of Cache System Using a Novel Cache Memory Model

Authors: Guowei Hou, Lixin Yu, Wei Zhuang, Hui Qin, Xue Yang

Abstract:

Formal verification is proposed to ensure the correctness of a design and to make functional verification more efficient. Since the cache plays a vital role in System on Chip (SoC) design, and a cache with a Memory Management Unit (MMU) and a cache memory unit makes the state space too large to verify by simulation, a formal verification approach is presented for such system designs. In this paper, a formal model checking verification flow is suggested, and a new cache memory model, called the "exhaustive search model", is proposed. Instead of using a large RAM to represent the whole cache memory, the exhaustive search model employs just two cache blocks. Since the cache system contains a data cache (Dcache) and an instruction cache (Icache), the Dcache and Icache memory models are established separately using the same mechanism. Finally, the novel model is employed in the verification of a cache module of a custom-built SoC system that has been applied in practice. The results show that the cache system is verified correctly using the exhaustive search model, which makes the verification much more manageable and flexible.

Keywords: Cache system, formal verification, novel model, System on Chip (SoC).

7779 The Laser Line Detection for Autonomous Mapping Based on Color Segmentation

Authors: Pavel Chmelar, Martin Dobrovolny

Abstract:

Laser projection, or laser footprint detection, is today widely used in many fields of robotics, measurement, and electronics. The system accuracy strictly depends on precise detection of the laser footprint on target objects. This article deals with laser line detection based on RGB segmentation and component labeling. The measurement device is a purpose-developed optical rangefinder equipped with vertical sweeping of the laser beam and a high-quality camera. The system was developed mainly for automatic exploration and mapping of unknown spaces. The first section presents the new detection algorithm, and the second section presents measurement results. The measurements were performed under variable lighting conditions in interior spaces. The last part of the article presents the achieved results and the differences between day and night measurements.
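The abstract gives only the outline of the detection algorithm (color segmentation followed by component labeling); a minimal Python sketch of that outline, assuming a red laser line and an illustrative dominance threshold, could look like this:

```python
import numpy as np
from scipy import ndimage

def detect_laser_line(rgb: np.ndarray, red_margin: int = 60) -> np.ndarray:
    """Return a boolean mask of the largest red-dominant connected component.

    rgb        : H x W x 3 uint8 image
    red_margin : illustrative threshold; how much red must exceed green and blue
    """
    r, g, b = rgb[..., 0].astype(int), rgb[..., 1].astype(int), rgb[..., 2].astype(int)
    mask = (r - g > red_margin) & (r - b > red_margin)        # color segmentation
    labels, n = ndimage.label(mask)                            # component labeling
    if n == 0:
        return mask                                            # nothing detected
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))   # component areas
    return labels == (np.argmax(sizes) + 1)                    # keep the largest blob
```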

Keywords: Automatic mapping, color segmentation, component labeling, distance measurement, laser line detection, vector map.

7778 Automatic Generation of Ontology from Data Source Directed by Meta Models

Authors: Widad Jakjoud, Mohamed Bahaj, Jamal Bakkas

Abstract:

In this paper, we present a method for the automatic generation of an ontological model from any data source using Model Driven Architecture (MDA); this generation serves the cooperation between knowledge engineering and software engineering. Indeed, reverse engineering of a data source generates a software model (a data schema) that then undergoes transformations to generate the ontological model. The method uses meta-models to validate both the software and ontological models.

Keywords: Meta model, model, ontology, data source.

7777 Zero Inflated Strict Arcsine Regression Model

Authors: Y. N. Phang, E. F. Loh

Abstract:

The zero-inflated strict arcsine model is a newly developed model that has been found appropriate for modeling overdispersed count data. In this study, we extend the zero-inflated strict arcsine model to a zero-inflated strict arcsine regression model by taking into account the extra variability caused by excess zeros and by covariates in the count data. The maximum likelihood estimation method is used to estimate the parameters of this zero-inflated strict arcsine regression model.
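The strict arcsine distribution itself is not given in the abstract; for a generic zero-inflated count model with baseline probability mass function f(y; theta) and zero-inflation probability pi, the mixture used in maximum likelihood estimation takes the familiar form below (the notation is assumed here, not taken from the paper):

```latex
% Generic zero-inflated count model: excess zeros with probability \pi,
% otherwise draws from the baseline pmf f(\,\cdot\,;\theta)
P(Y = 0) = \pi + (1 - \pi)\, f(0;\theta), \qquad
P(Y = y) = (1 - \pi)\, f(y;\theta), \quad y = 1, 2, \dots
```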

Keywords: Overdispersed count data, maximum likelihood estimation, simulated annealing.

7776 Fault Detection of Drinking Water Treatment Process Using PCA and Hotelling's T2 Chart

Authors: Joval P George, Dr. Zheng Chen, Philip Shaw

Abstract:

This paper deals with the application of Principal Component Analysis (PCA) and Hotelling's T2 chart, using data collected from a drinking water treatment process. PCA is applied primarily for dimensional reduction of the collected data, while Hotelling's T2 control chart is used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works and downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T2, can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T2 chart from the collected data.
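The abstract does not reproduce the monitoring statistic; a common way to combine PCA with Hotelling's T2 for fault detection is to compute T2 on the retained principal component scores and compare it with an F-distribution control limit. A minimal Python sketch of that generic scheme (not the SIMCA-P workflow; the component count and significance level are illustrative) might be:

```python
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

def hotelling_t2(X_train: np.ndarray, X_new: np.ndarray, n_components: int = 2, alpha: float = 0.01):
    """Hotelling's T2 on PCA scores, with an F-distribution control limit."""
    pca = PCA(n_components=n_components).fit(X_train)
    scores = pca.transform(X_new)
    lam = pca.explained_variance_                       # variances of retained PCs
    t2 = np.sum(scores**2 / lam, axis=1)                # T2 value per new observation
    n, a = X_train.shape[0], n_components
    ucl = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(1 - alpha, a, n - a)
    return t2, ucl

# Observations whose T2 exceeds the control limit are flagged as potential faults.
```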

Keywords: Principal component analysis, hotelling's t2 chart, multivariate statistical process control, drinking water treatment.

7775 Comparison of Two Interval Models for Interval-Valued Differential Evolution

Authors: Hidehiko Okada

Abstract:

The author previously proposed an extension of differential evolution (DE). The proposed method extends the processes of DE to handle interval numbers as genotype values, so that DE can be applied to interval-valued optimization problems. The interval DE can employ either of two interval models for specifying genotype values: the lower and upper (LU) model or the center and width (CW) model. The ability of the interval DE to search for solutions may depend on the model. In this paper, the author compares the two models to investigate which one helps the interval DE find better solutions. The application of the interval DE considered here is the evolutionary training of interval-valued neural networks. The results of a preliminary study indicate that the CW model is better than the LU model: the interval DE with the CW model could evolve better neural networks.
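The two genotype representations amount to a simple conversion between a lower/upper (LU) pair and a center/width (CW) pair. A minimal Python illustration, assuming the width is stored as a half-width and repaired to be non-negative (details the abstract does not specify), could be:

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lower: float
    upper: float

def to_center_width(iv: Interval) -> tuple[float, float]:
    """LU -> CW: encode an interval [l, u] as (center, half-width)."""
    return (iv.lower + iv.upper) / 2.0, (iv.upper - iv.lower) / 2.0

def from_center_width(center: float, width: float) -> Interval:
    """CW -> LU: decode genotype values (c, w) back to [c - w, c + w]."""
    w = abs(width)            # repair step so the decoded interval stays valid
    return Interval(center - w, center + w)
```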

Keywords: Evolutionary algorithms, differential evolution, neural network, neuroevolution, interval arithmetic.

7774 Optimization and Feasibility Analysis of PV/Wind/ Battery Hybrid Energy Conversion

Authors: Doaa M. Atia, Faten H. Fahmy, Ninet M. Ahmed, Hassen T. Dorrah

Abstract:

In this paper, the optimum design of a renewable energy system powering an aquaculture pond is determined. The Hybrid Optimization Model for Electric Renewables (HOMER) software, developed by the U.S. National Renewable Energy Laboratory (NREL), is used to analyze the feasibility of the stand-alone and hybrid systems in this study. The HOMER program determines whether the renewable energy resources satisfy the hourly electric demand. The program calculates the energy balance for each of the 8760 hours in a year to simulate the operation of the system: the electrical demand for each hour of the year is compared with the energy supplied by the system for that hour, and the relevant energy flow is calculated for each component in the model. The essential principle is to minimize the total system cost while HOMER ensures control of the system. The feasibility of the energy system is also studied, with wind speed, solar irradiance, interest rate, and capacity shortage taken into consideration. The simulation results indicate that the hybrid system is the best choice in this study, yielding a lower net present cost. It thus provides higher system performance than PV or wind stand-alone systems.

Keywords: Wind stand-alone system, Photovoltaic stand-alone system, Hybrid system, Optimum system sizing, feasibility, Cost analysis.

7773 Assessing Complexity of Neuronal Multiunit Activity by Information Theoretic Measure

Authors: Young-Seok Choi

Abstract:

This paper provides a quantitative measure of time-varying multiunit neuronal spiking activity (MUA) using an entropy-based approach. To characterize the state embedded in the activity of a population of neurons, the discrete wavelet transform (DWT) is used to isolate the inherent spiking activity of the MUA. Due to the de-correlating property of the DWT, the spiking activity is preserved while the non-spiking component is reduced. By evaluating the entropy of the wavelet coefficients of the de-noised MUA, a multiresolution Shannon entropy (MRSE) of the MUA signal is developed. The proposed entropy was tested in the analysis of both simulated noisy MUA and actual MUA recorded from the cortex in a rodent model. Simulation and experimental results demonstrate that the dynamics of a population can be quantified using the proposed entropy.
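The exact MRSE definition is not given in the abstract; one straightforward reading, computing the Shannon entropy of the normalised wavelet-coefficient energies at each decomposition level, can be sketched in Python with PyWavelets (the wavelet choice and level count are illustrative):

```python
import numpy as np
import pywt

def multiresolution_shannon_entropy(signal: np.ndarray, wavelet: str = "db4", level: int = 4):
    """Shannon entropy of the normalised wavelet-coefficient energies per level."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)    # [cA_L, cD_L, ..., cD_1]
    entropies = []
    for c in coeffs:
        energy = c**2
        p = energy / (energy.sum() + 1e-12)                # energy distribution per level
        entropies.append(-np.sum(p * np.log2(p + 1e-12)))  # Shannon entropy in bits
    return entropies
```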

Keywords: Discrete wavelet transform, Entropy, Multiresolution, Multiunit activity.

7772 A Martingale Residual Diagnostic for Logistic Regression Model

Authors: Entisar A. Elgmati

Abstract:

Martingale residual diagnostics for assessing the fit of a logistic regression model to recurrent event data are studied. One way of assessing the fit is to plot the empirical standard deviation of the standardized martingale residual processes. Here we use another diagnostic plot, based on the martingale residual covariance. We investigate the performance of the plot under several types of model misspecification, and the method clearly picks up the wrong model. We also present a test statistic that supplements the inspection of the two diagnostics. The power of the test statistic agrees with what is seen in the plots of the estimated martingale covariance.

Keywords: Covariance, logistic model, misspecification, recurrent events.

7771 An Intelligent Inference Model about Complex Systems' Stability: Inspiration from Nature

Authors: Naiqin Feng, Yuhui Qiu, Yingshan Zhang, Fang Wang

Abstract:

A logic model for analyzing the stability of complex systems is very useful in many areas of science. In the real world, we are inspired by natural phenomena such as the "biosphere", the "food chain", and "ecological balance". Through research and practice, and by taking advantage of the orthogonality and symmetry defined by the theory of multilateral matrices, we put forward a logic analysis model of the stability of complex systems with three relations, and prove it mathematically. This logic model is usually successful in analyzing the stability of a complex system. The structure of the logic model is not only clear and simple, but can also easily be used to investigate and solve many stability problems of complex systems. As an application, some examples are given.

Keywords: Complex system, logic model, relation, stability.

7770 An Adverse Model for Price Discrimination in the Case of Monopoly

Authors: Daniela Elena Marinescu, Ioana Manafi, Dumitru Marin

Abstract:

We consider a Principal-Agent model in which the Principal is a seller who does not know exactly how much the buyer (the Agent) is willing to pay for the good. The buyer's preferences are hence his private information. The model corresponds to the nonlinear pricing problem of Maskin and Riley. We assume there are three types of Agents. The model is solved using "informational rents" as variables. In the last section we present the main characteristics of the optimal contracts under asymmetric information and some possible extensions of the model.

Keywords: Adverse selection, asymmetric information, informational rent, nonlinear pricing, optimal contract

7769 A Convenient Model for I-V Characteristic of a Solar Cell Generator as an Active Two-Pole with Self-Limitation of Current

Authors: A. A. Penin, A. S. Sidorenko

Abstract:

A convenient and physically sound mathematical model of the external, or I-V, characteristic of solar cell generators is presented in this paper. The model is compared with the traditional p-n junction model. The direct analytical calculation of the load regime leads to a quadratic equation, which importantly simplifies the calculations in real time.

Keywords: A solar cell generator, I-V characteristic, active two-pole.

7768 Analytical Model for Predicting Whole Building Heat Transfer

Authors: Xiaoshu Lu, Martti Viljanen

Abstract:

A new analytical model is developed which provides closed-form solutions for both the transient indoor and envelope temperature changes in buildings. The time-dependent boundary temperature is represented as a Fourier series, which can approximate real weather conditions. The final closed-form solutions are simple, concise, and comprehensive. The model was compared with numerical results, and good accuracy was obtained. The model can be used to provide design and control guidelines in engineering applications for analysing the mechanical heat transfer properties of buildings.
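The abstract does not show the boundary-temperature expansion; a generic truncated Fourier series of the kind described, with period P (e.g. one day or one year) and coefficients a_n, b_n fitted to weather data, would take the form:

```latex
% Generic truncated Fourier series for a periodic boundary temperature
% (the symbols P, N, a_n, b_n are assumed notation, not the paper's)
T_b(t) \approx a_0 + \sum_{n=1}^{N} \left[ a_n \cos\!\left(\frac{2\pi n t}{P}\right)
      + b_n \sin\!\left(\frac{2\pi n t}{P}\right) \right]
```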

Keywords: Analytical model, heat transfer, whole building.

7767 The Implicit Methods for the Study of Tolerance

Authors: M. Bambulyakа

Abstract:

Tolerance is a tool for achieving social cohesion, particularly among individuals and groups with different values. The aim is to study the characteristics of the ethnic tolerance of the inhabitants of Latvia. Ethnic tolerance is treated as a set of conscious and unconscious orientations of the individual in social interaction and inter-ethnic communication. The study uses empirical tools for measuring ethnic tolerance, which allow the explicit and implicit levels of the emotional component among Latvia's residents to be identified. Explicit measurements were made using self-report techniques, which yielded the ethnic tolerance index and the ethnic identity of the participants. The implicit component was studied using methods based on the emotional priming effect. During the processing of the results, indicators of positive and negative implicit attitudes towards members of one's own and other ethnic groups were calculated, as well as the explicit parameters of the ethnic tolerance and ethnic identity of Latvia's residents. The implicit measurements of the attitudes of neighboring ethnic groups towards each other showed a mutual negative attitude, whereas the explicit measurements indicate a neutral attitude. The data obtained contribute to further study of the ethnic tolerance of Latvia's residents.

Keywords: ethnic tolerance, implicit measure, priming, ethnic attitudes

7766 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing

Authors: V. Barot, S. McLeod, R. Harrison, A. A. West

Abstract:

Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. To stay competitive and flexible, enterprises are also driven to distribute themselves globally, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners by adopting a circular-buffer-based data processing mechanism. The work highlighted in this paper covers the implementation of this mechanism in the manufacturing industry domain. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources, such as an HMI, VRML-based 3D views, and remote client instances, regardless of their distribution or their mechanisms. This approach is presented together with a set of evaluation results. The authors' main concern is the reliability and the performance of the adopted approach. Performance evaluation is carried out in terms of the response times taken to process the data in this domain, compared with an alternative data processing implementation such as a linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved by the proposed implementation and highlight further research work to be carried out.
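The Broadcaster's actual implementation is not given in the abstract; as a generic illustration of the circular-buffer mechanism it adopts in place of a linear queue, a minimal Python sketch (the class and method names are hypothetical) might be:

```python
class RingBuffer:
    """Fixed-size circular buffer: when full, the oldest entry is overwritten,
    so a slow remote reader never blocks the producer (the usual argument for
    a circular buffer over a linear queue)."""

    def __init__(self, capacity: int):
        self._data = [None] * capacity
        self._capacity = capacity
        self._head = 0          # next write position
        self._size = 0

    def put(self, item) -> None:
        self._data[self._head] = item
        self._head = (self._head + 1) % self._capacity
        self._size = min(self._size + 1, self._capacity)

    def snapshot(self) -> list:
        """Return items from oldest to newest without removing them."""
        start = (self._head - self._size) % self._capacity
        return [self._data[(start + i) % self._capacity] for i in range(self._size)]

# Example: machine status updates pushed by a producer and read by remote clients
buf = RingBuffer(capacity=4)
for reading in ["temp=71", "temp=72", "temp=73", "temp=74", "temp=75"]:
    buf.put(reading)
print(buf.snapshot())   # the oldest entry 'temp=71' has been overwritten
```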

Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.

7765 Human Action Recognition Based on Ridgelet Transform and SVM

Authors: A. Ouanane, A. Serir

Abstract:

In this paper, a novel algorithm based on the ridgelet transform and support vector machines is proposed for human action recognition. The ridgelet transform is a directional multi-resolution transform, and it is well suited to describing human actions because its directional information can be used to form spatial feature vectors. The dynamic transition between the spatial features is captured using both Principal Component Analysis and the K-means clustering algorithm. First, Principal Component Analysis is used to reduce the dimensionality of the obtained vectors. Then, the K-means algorithm is used to quantize the reduced vectors into a spatio-temporal pattern, called a set-of-labels, according to the given periodicity of the human action. Finally, a Support Vector Machine classifier is used to discriminate between the different human actions. Various tests are conducted on popular datasets such as Weizmann and KTH. The obtained results show that the proposed method achieves a significant accuracy rate and greater robustness in very challenging situations such as lighting changes, scaling, and dynamic environments.
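The ridgelet feature extraction itself is not detailed in the abstract; the downstream PCA, K-means, and SVM stages it describes can be sketched as follows in Python with scikit-learn (the random frame descriptors stand in for ridgelet features, and all sizes and labels are illustrative, not the paper's setup):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.svm import SVC

# X_frames: per-frame feature vectors (random placeholders for ridgelet descriptors);
# y: one action label per sequence of `frames_per_seq` frames.
rng = np.random.default_rng(0)
n_seq, frames_per_seq, dim = 40, 20, 128
X_frames = rng.normal(size=(n_seq * frames_per_seq, dim))
y = rng.integers(0, 4, size=n_seq)

pca = PCA(n_components=10).fit(X_frames)                          # 1) reduce dimensionality
Z = pca.transform(X_frames)
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(Z)   # 2) quantize frames
labels = kmeans.labels_.reshape(n_seq, frames_per_seq)

# 3) one histogram of frame labels ("set-of-labels") per sequence, then an SVM classifier
hist = np.stack([np.bincount(seq, minlength=8) for seq in labels]).astype(float)
clf = SVC(kernel="rbf").fit(hist, y)
print(clf.score(hist, y))
```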

Keywords: Human action, Ridgelet Transform, PCA, K-means, SVM.

7764 Supply Chain Modeling and Improving Manufacturing Industry in Developing Countries: A Research Agenda

Authors: F.B. Georgise, K. D. Thoben, M. Seifert

Abstract:

This paper presents a research agenda on SCOR model adaptation. The SCOR model is designed to measure supply chain performance and logistics impact across the boundaries of individual organizations. It is at the growth stage of its life cycle and is well on its way to becoming the industry standard. The SCOR model has been developed and used widely in the context of developed countries. This research focuses on adapting the SCOR model to the manufacturing industry in developing countries. Based on the necessary understanding of the characteristics, difficulties, and problems of the manufacturing industry's supply chains in developing countries, we aim to design an adapted model with its building blocks: a business process model, performance measures, and best practices.

Keywords: developing countries, manufacturing industry, SCOR model adaptation

7763 Identification of Risks Associated with Process Automation Systems

Authors: J. K. Visser, H. T. Malan

Abstract:

A need exists to identify the sources of risk associated with the process automation systems within petrochemical companies and similar energy-related industries. These companies use many different process automation technologies in their value chains. A crucial part of the process automation system is the information technology component featuring in the supervisory control layer. The ever-changing technology within the process automation layers and the rate at which it advances pose a risk to safe and predictable automation system performance. The age of the automation equipment also presents challenges to the operations and maintenance managers of the plant, due to obsolescence and unavailability of spare parts. The main objective of this research was to determine the risk sources associated with the equipment that is part of the process automation systems. A secondary objective was to establish whether technology managers and technicians were aware of the risks and shared the same viewpoint on the importance of the risks associated with automation systems. A conceptual model for the risk sources of automation systems was formulated from models and frameworks in the literature. This model comprises six categories of risk, which form the basis for identifying specific risks. The model was used to develop a questionnaire that was sent to 172 instrument technicians and technology managers in the company to obtain primary data; 75 completed and useful responses were received. These responses were analyzed statistically to determine the highest risk sources and to determine whether there was a difference in opinion between technology managers and technicians. The most important risks revealed in this study are: 1) the lack of skilled technicians, 2) the integration capability of third-party system software, 3) the reliability of the process automation hardware, 4) excessive costs of performing maintenance and migrations on process automation systems, and 5) the requirement for third-party communication interfacing compatibility as well as real-time communication networks.

Keywords: Distributed control system, identification of risks, information technology, process automation system.

7762 Development of Logic Model for R&D Program Plan Analysis in Preliminary Feasibility Study

Authors: Hyun-Kyu Kang

Abstract:

The Korean Government has applied the preliminary feasibility study to new government R&D program plans as part of an evaluation system for R&D programs. The preliminary feasibility study for an R&D program is composed of three major criteria: technological, policy, and economic analysis. The program logic model approach is used as part of the technological analysis in the preliminary feasibility study. We have developed and improved the R&D program logic model. The logic model is a very useful tool for evaluating R&D program plans: using it, we can generally identify the important factors of an R&D program plan, analyze its logic flow, and find disconnections or jumps in the logic flow among the components of the logic model.

Keywords: Preliminary feasibility study, R&D program logic model, technological analysis.

7761 Stochastic Subspace Modelling of Turbulence

Authors: M. T. Sichani, B. J. Pedersen, S. R. K. Nielsen

Abstract:

Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence, reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper, an empirical cross-spectral density function for the along-wind turbulence component over the wind field area is taken as the starting point. The spectrum is spatially discretized in terms of a Hermitian cross-spectral density matrix for the turbulence state vector, which turns out not to be positive definite. Since the succeeding state space and ARMA modelling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, the problem of the non-positive definiteness of such matrices is first addressed and suitable treatments are proposed. From the adjusted positive definite cross-spectral density matrix, a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order and estimation of a state space model for the vector turbulence process, incorporating its phase spectrum, in one stage; its results are compared with a conventional ARMA modelling method.

Keywords: Turbulence, wind turbine, complex coherence, state space modelling, ARMA modelling.

7760 An Improved Model for Prediction of the Effective Thermal Conductivity of Nanofluids

Authors: K. Abbaspoursani, M. Allahyari, M. Rahmani

Abstract:

Thermal conductivity is an important characteristic of a nanofluid in laminar-flow heat transfer. This paper presents an improved model for the prediction of the effective thermal conductivity of nanofluids based on dimensionless groups. The model expresses the thermal conductivity of a nanofluid as a function of the thermal conductivities of the solid and liquid, their volume fractions, and the particle size. The proposed model includes a parameter which accounts for the interfacial shell, Brownian motion, and particle aggregation. The model is validated against experimental results for TiO2-water and Al2O3-water nanofluids.

Keywords: Critical particle size, nanofluid, model, and thermal conductivity.

7759 Authentication Analysis of the 802.11i Protocol

Authors: Zeeshan Furqan, Shahabuddin Muhammad, Ratan Guha

Abstract:

IEEE designed the 802.11i protocol to address the security issues in wireless local area networks. Formal analysis is important to ensure that protocols work properly without having to resort to tedious testing and debugging, which can only show the presence of errors, never their absence. In this paper, we present the formal verification of an abstract protocol model of 802.11i. We translate the 802.11i protocol into the Strand Space Model and then prove the authentication property of the resulting model using the Strand Space formalism. The intruder in our model is endowed with powerful capabilities, and the repercussions of possible attacks are evaluated. Our analysis proves that the authentication of 802.11i is not compromised in the presented model. We further demonstrate how changes to our model would yield a successful man-in-the-middle attack.

Keywords: authentication, formal analysis, formal verification, security.

7758 Learning the Dynamics of Articulated Tracked Vehicles

Authors: Mario Gianni, Manuel A. Ruiz Garcia, Fiora Pirri

Abstract:

In this work, we present a Bayesian non-parametric approach to modeling the motion control of articulated tracked vehicles (ATVs). The motion control model is based on a Dirichlet Process-Gaussian Process (DP-GP) mixture model. The DP-GP mixture model provides a flexible representation of patterns of control manoeuvres along trajectories of different lengths and discretizations. The model also estimates the number of patterns that is sufficient for modeling the dynamics of the ATV.

Keywords: Dirichlet processes, Gaussian processes, robot control learning, tracked vehicles.

7757 Building an e-Learning System Model with Implications for Research and Instructional Use

Authors: Kuan-Chou Chen, Keh-Wen “Carin” Chuang

Abstract:

This paper demonstrates a model of an e-Learning system based on current learning theory and distance education practice. The relationships in the model are designed to be simple and functional and do not necessarily represent any particular e-Learning environment. It is meant to be a generic e-Learning system model with implications for the instructional design of any distance education course. It allows online instructors to move away from the discrepancy between the courses and the body of knowledge. The interrelationships of the four primary sectors of the e-Learning system are presented in this paper. This integrated model includes (1) pedagogy, (2) technology, (3) teaching, and (4) learning, and the interactions within each of these sectors are depicted by a system loop map.

Keywords: e-Learning system, online courses instructional design, integrated model, interrelationships.

7756 Forecasting for Financial Stock Returns Using a Quantile Function Model

Authors: Yuzhi Cai

Abstract:

In this talk, we introduce a newly developed quantile function model that can be used for estimating conditional distributions of financial returns and for obtaining multi-step-ahead, out-of-sample predictive distributions of financial returns. Since we forecast the whole conditional distribution, any predictive quantity of interest about future financial returns can be obtained simply as a by-product of the method. We also show an application of the model to the daily closing prices of the Dow Jones Industrial Average (DJIA) series over the period from 2 January 2004 to 8 October 2010. We obtained predictive distributions up to 15 days ahead for the DJIA returns, which were then compared with the actually observed returns and with those predicted from an AR-GARCH model. The results show that the new model can capture the main features of financial returns and provides a better-fitted model together with improved mean forecasts compared with conventional methods. We hope this talk will help the audience see that this new model has the potential to be very useful in practice.

Keywords: DJIA, Financial returns, predictive distribution, quantile function model.
