Search results for: standard methodology.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3021


2001 Finite Element Method Analysis of Occluded-Ear Simulator and Natural Human Ear Canal

Authors: M. Sasajima, T. Yamaguchi, Y. Hu, Y. Koike

Abstract:

In this paper, we discuss the propagation of sound in the narrow pathways of an occluded-ear simulator typically used for the measurement of insert-type earphones. The simulator has a standardized frequency response conforming to the international standard IEC 60318-4. In narrow pathways, the speed and phase of sound waves are modified by viscous air damping. In our previous paper, we proposed a new finite element method (FEM) to account for the effects of air viscosity in this type of audio equipment. In this study, we compare the results from the ear-simulator FEM model and from a three-dimensional human ear canal FEM model built from computed tomography images with measured frequency response data from the ear canals of 18 people.

Keywords: Ear simulator, FEM, viscosity, human ear canal.

2000 Color Constancy using Superpixel

Authors: Xingsheng Yuan, Zhengzhi Wang

Abstract:

Color constancy algorithms are generally based on simplifying assumptions about the spectral distribution or the reflection attributes of the scene surface. In reality, however, these assumptions are too restrictive. We therefore propose to extend existing algorithms by applying color constancy locally to image patches rather than globally to the entire image. In this paper, a method based on low-level image features using superpixels is proposed. Superpixel segmentation partitions an image into regions that are approximately uniform in size and shape. Instead of using the entire pixel set for estimating the illuminant, only the superpixels with the most valuable information are used. Large-scale experiments on real-world scenes show that the estimate is more accurate when using superpixels than when using the entire image.
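As a rough illustration of this idea (the abstract does not give the exact features or selection rule), the following Python sketch assumes SLIC superpixels from scikit-image, a gray-world estimate inside each superpixel, and a simple variance-based choice of the "most valuable" superpixels; it is an assumed realization, not the authors' implementation.

```python
# Hypothetical sketch of superpixel-based illuminant estimation:
# gray-world per SLIC superpixel plus variance-based selection.
import numpy as np
from skimage.segmentation import slic

def estimate_illuminant(image, n_segments=200, keep_fraction=0.2):
    """image: float RGB array in [0, 1] with shape (H, W, 3)."""
    labels = slic(image, n_segments=n_segments, compactness=10.0)
    candidates = []
    for lab in np.unique(labels):
        pixels = image[labels == lab]          # pixels of one superpixel, shape (N, 3)
        mean_rgb = pixels.mean(axis=0)         # local gray-world illuminant estimate
        score = pixels.std()                   # crude "information" measure
        candidates.append((score, mean_rgb))
    candidates.sort(key=lambda c: c[0], reverse=True)
    n_keep = max(1, int(keep_fraction * len(candidates)))
    illuminant = np.mean([rgb for _, rgb in candidates[:n_keep]], axis=0)
    return illuminant / np.linalg.norm(illuminant)
```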

Keywords: color constancy, illuminant estimation, superpixel

1999 Bank Business Models and The Changes in CEE Countries

Authors: I. Erins, J. Erina

Abstract:

The aim of this article is to assess the business models used by banks operating in the CEE countries in the period from 2006 to 2011. To obtain the research results, the authors performed a qualitative analysis of the scientific literature on bank business models, which were grouped into clusters consisting of the following components: 1) capital and reserves; 2) assets; 3) deposits; and 4) loans. In turn, the bank business models were developed based on the types of core activities of the banks and divided into four groups: wholesale, investment, retail and universal banks. Descriptive statistics were used to analyse the models, determining the mean, minimum and maximum values of the constituent cluster components, as well as the standard deviation. The analysis of the data is based on the bank performance indicators Return on Assets (ROA) and Return on Equity (ROE).

Keywords: Banks, Business model, CEE, ROA, ROE.

1998 Frequency Response of Complex Systems with Localized Nonlinearities

Authors: E. Menga, S. Hernandez

Abstract:

Finite Element Models (FEMs) are widely used to study and predict the dynamic properties of structures, and the prediction is usually much more accurate for a single component than for an assembly. For structural dynamics studies in the low and middle frequency range in particular, most complex FEMs can be seen as assemblies of linear components joined together at interfaces. From a modelling and computational point of view, these joints are localized sources of stiffness and damping and can be modelled as lumped spring/damper elements, most of the time characterized by nonlinear constitutive laws. On the other hand, most FE programs run nonlinear analysis in the time domain and treat the whole structure as nonlinear, even if only one degree of freedom (DOF) out of thousands is nonlinear, which makes the analysis unnecessarily expensive from a computational point of view. In this work, a methodology for obtaining the nonlinear frequency response of structures whose nonlinearities can be considered localized sources is presented. The work extends the well-known Structural Dynamic Modification Method (SDMM) to a nonlinear set of modifications and obtains the Nonlinear Frequency Response Functions (NLFRFs) through an 'updating' process of the Linear Frequency Response Functions (LFRFs). A brief summary of the analytical concepts is given, starting from the linear formulation and examining the implications of the nonlinear one. The response of the system is formulated in both the time and frequency domains. First, the modal database is extracted and the linear response is calculated. Secondly, the nonlinear response is obtained through the nonlinear SDMM, by updating the underlying linear behavior of the system. The methodology, implemented in MATLAB, has been successfully applied to estimate the nonlinear frequency response of two systems: a two-DOF spring-mass-damper system and a full aircraft FE model. In spite of the different levels of complexity, both examples show the reliability and effectiveness of the method. The results highlight a feasible and robust procedure that allows a quick estimation of the effect of localized nonlinearities on the dynamic behavior. The method is particularly powerful when most of the FE model can be considered to act linearly and the nonlinear behavior is restricted to a few degrees of freedom. The procedure is also very attractive from a computational point of view, because the FEM needs to be run just once, which allows faster nonlinear sensitivity analyses and easier implementation of optimization procedures for the calibration of nonlinear models.
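The NL SDMM formulation itself is in the paper, not the abstract; as a minimal sketch of the underlying idea (the linear dynamic stiffness assembled once, then "updated" for a localized nonlinearity), the following Python fragment applies a describing-function style fixed-point correction to a two-DOF spring-mass-damper with an assumed cubic spring on the first DOF. All numerical values are placeholders.

```python
# Illustrative sketch only: the linear model is built once and the FRF is
# corrected iteratively for a localized cubic stiffness on DOF 1, loosely
# in the spirit of the nonlinear SDMM described above (not the paper's code).
import numpy as np

M = np.diag([1.0, 1.0])                              # masses [kg]
K = np.array([[2.0e4, -1.0e4], [-1.0e4, 1.0e4]])     # linear stiffness [N/m]
C = 0.002 * K                                        # proportional damping
k3 = 1.0e9                                           # assumed local cubic stiffness [N/m^3]
F = np.array([10.0, 0.0])                            # harmonic force amplitude [N]

freqs = np.linspace(1.0, 40.0, 400)                  # [Hz]
X_nl = np.zeros((len(freqs), 2), dtype=complex)

for i, f in enumerate(freqs):
    w = 2.0 * np.pi * f
    Z_lin = K + 1j * w * C - w**2 * M                # linear dynamic stiffness
    x = np.linalg.solve(Z_lin, F)                    # linear response as starting point
    for _ in range(50):                              # fixed-point update of the local term
        dK = np.zeros_like(K)
        dK[0, 0] = 0.75 * k3 * abs(x[0])**2          # equivalent (describing-function) stiffness
        x_new = np.linalg.solve(Z_lin + dK, F)
        if np.max(np.abs(x_new - x)) < 1e-10:
            x = x_new
            break
        x = 0.5 * x + 0.5 * x_new                    # relaxation for stability
    X_nl[i] = x

# abs(X_nl[:, 0]) versus freqs approximates the nonlinear FRF at DOF 1.
```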

Keywords: Frequency response, nonlinear dynamics, structural dynamic modification, softening effect, rubber.

1997 Applying Wavelet Entropy Principle in Fault Classification

Authors: S. El Safty, A. El-Zonkoly

Abstract:

The ability to detect and classify the type of fault plays a great role in the protection of power systems, and the procedure must be both precise and fast. In this paper, detection of the fault type has been implemented using wavelet analysis together with the wavelet entropy principle. The power system is simulated using PSCAD/EMTDC. Different types of faults were studied, yielding various current waveforms. These current waveforms were decomposed by wavelet analysis into different approximations and details. The wavelet entropy of these decompositions was then analyzed, leading to a successful methodology for fault classification. The suggested approach was tested on different fault types and proved successful in identifying the type of fault.
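The abstract does not state which wavelet or entropy definition is used; as a hedged sketch of the feature-extraction step, the fragment below decomposes a current waveform with PyWavelets and computes a Shannon-type entropy of the relative energy per decomposition level.

```python
# Hypothetical wavelet-entropy feature extractor for a fault current
# waveform; the db4 wavelet, 4 levels and Shannon entropy of relative
# level energies are assumptions, not taken from the paper.
import numpy as np
import pywt

def wavelet_entropy_features(current, wavelet="db4", level=4):
    coeffs = pywt.wavedec(current, wavelet, level=level)   # [cA_n, cD_n, ..., cD_1]
    energies = np.array([np.sum(c**2) for c in coeffs])
    p = energies / energies.sum()                          # relative energy per level
    entropy = -np.sum(p * np.log(p + 1e-12))               # total wavelet entropy
    return np.concatenate([energies, [entropy]])           # feature vector for a classifier

# Features from the three phase currents could be stacked and passed to
# any standard classifier to label the fault type.
```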

Keywords: Fault classification, wavelet transform, wavelet entropy.

1996 Analyzing the Effects of Resource Relatedness on Strategic Alliances Performance

Authors: G. Chung, B. Choi

Abstract:

Very few studies have examined the performance implications of strategic alliance announcements in the information technologies industry from a resource-based view, and none of them have investigated resource congruence and alliance motive as potential sources of abnormal firm performance. This paper extends the current resource-based literature to discover and explore the linkages between these concepts and the practical performance of strategic alliances. The study finds that strategic alliance announcements have provided overall abnormal positive returns, and that marketing alliances with marketing resource incongruence have also contributed to significant firm performance.

Keywords: Event study methodology, resource-based theory, resource relatedness, strategic alliance.

1995 Mechanical Properties of Particle Boards from Maize Cob and Urea-Formaldehyde Resin

Authors: A. Danladi, I. O. Patrick

Abstract:

Particle boards were prepared from maize cob (MC) and urea-formaldehyde resin (UFR) on a compression moulding machine. The amount of MC was varied from 50 to 120 g, while the amount of UFR was kept constant at 30 g. Some mechanical properties of the particle boards were tested using standard ASM methods. The results show that as the MC content increased from 50 to 120 g in 30 g of UFR, the hardness increased from about 6.89 × 10² to 7.51 × 10² MPa. The impact strength decreased from 3.3 × 10⁻² to 0.45 × 10⁻² J/m², while the tensile strength initially increased from 2.63 × 10² to 3.14 × 10² MPa as the MC increased from 50 to 60 g in 30 g of UFR, and thereafter decreased to about 1.35 × 10² MPa at 120 g of MC in 30 g of UFR.

Keywords: Hardness, Impact strength, Maize cob, Tensile strength and Urea-formaldehyde resin.

1994 Robust H∞ Fuzzy Control Design for Nonlinear Two-Time Scale System with Markovian Jumps based on LMI Approach

Authors: Wudhichai Assawinchaichote, Sing Kiong Nguang

Abstract:

This paper examines the problem of designing a robust H∞ state-feedback controller for a class of nonlinear two-time scale systems with Markovian Jumps described by a Takagi-Sugeno (TS) fuzzy model. Based on a linear matrix inequality (LMI) approach, LMI-based sufficient conditions for the uncertain Markovian jump nonlinear two-time scale systems to have an H∞ performance are derived. The proposed approach does not involve the separation of states into slow and fast ones and it can be applied not only to standard, but also to nonstandard nonlinear two-time scale systems. A numerical example is provided to illustrate the design developed in this paper.

Keywords: TS fuzzy, Markovian jumps, LMI, two-time scale systems.

1993 An Effective Hybrid Genetic Algorithm for Job Shop Scheduling Problem

Authors: Bin Cai, Shilong Wang, Haibo Hu

Abstract:

The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
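For readers unfamiliar with the operation-based representation mentioned above, the sketch below decodes such a chromosome into a (semi-active) schedule and its makespan on a made-up 3x3 instance; the GA operators, the active-schedule builder and the local search of the paper are not reproduced.

```python
# Sketch of decoding an operation-based JSSP chromosome into start times
# and a makespan. The 3-job / 3-machine instance is fabricated for
# illustration only.
# jobs[j] = list of (machine, processing_time) in technological order
jobs = [
    [(0, 3), (1, 2), (2, 2)],
    [(0, 2), (2, 1), (1, 4)],
    [(1, 4), (2, 3), (0, 1)],
]

def decode(chromosome, jobs):
    """chromosome: job indices, each job repeated once per operation,
    e.g. [2, 0, 1, 0, 2, 1, 1, 0, 2]."""
    n_machines = 1 + max(m for ops in jobs for m, _ in ops)
    next_op = [0] * len(jobs)                 # next operation index per job
    job_ready = [0] * len(jobs)               # earliest start per job
    mach_ready = [0] * n_machines             # earliest start per machine
    schedule = []
    for j in chromosome:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready[machine])
        end = start + duration
        schedule.append((j, next_op[j], machine, start, end))
        job_ready[j] = end
        mach_ready[machine] = end
        next_op[j] += 1
    makespan = max(end for *_, end in schedule)
    return schedule, makespan

_, makespan = decode([2, 0, 1, 0, 2, 1, 1, 0, 2], jobs)
print("makespan:", makespan)
```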

Keywords: Genetic algorithm, Job shop scheduling problem, Local search, Meta-heuristic algorithm

1992 A DEA Model for Performance Evaluation in The Presence of Time Lag Effect

Authors: Yanshuang Zhang, Byungho Jeong

Abstract:

Data Envelopment Analysis (DEA) is a methodology that computes efficiency values for decision making units (DMUs) in a given period by comparing their outputs with their inputs. In many cases, there is some time lag between the consumption of inputs and the production of outputs, and for a long-term research project the production lead-time phenomenon is hard to avoid. This time-lag effect should be considered when evaluating the performance of organizations. This paper suggests a model for calculating efficiency values for the performance evaluation problem with time lag. In the experimental part, the proposed methods are compared with the CCR model and an existing time-lag model, using the data set of the 21st Century Frontier R&D Program, a long-term national R&D program of Korea.
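As background for readers unfamiliar with the CCR model used as a benchmark above, the following sketch computes input-oriented CCR efficiency scores with scipy; the time-lag extension proposed in the paper is not shown, and the data are fabricated.

```python
# Basic input-oriented CCR efficiency (no time lag), as a point of
# reference for the model discussed above. Data are made up.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])   # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                        # outputs, one row per DMU
n_dmu, n_in = X.shape
n_out = Y.shape[1]

def ccr_efficiency(o):
    # decision vector: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n_dmu)
    c[0] = 1.0                                    # minimize theta
    A_ub, b_ub = [], []
    for i in range(n_in):                         # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(n_out):                        # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(None, None)] + [(0, None)] * n_dmu
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
    return res.x[0]

print([round(ccr_efficiency(o), 3) for o in range(n_dmu)])
```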

Keywords: DEA, Efficiency, Time Lag

1991 Using Fractional Factorial Designs for Variable Importance in Random Forest Models

Authors: Ewa. M. Sztendur, Neil T. Diamond

Abstract:

Random Forests are a powerful classification technique consisting of a collection of decision trees. One useful feature of Random Forests is the ability to determine the importance of each variable in predicting the outcome. This is done by permuting each variable and computing the change in prediction accuracy before and after the permutation. This variable importance calculation is similar to a one-factor-at-a-time experiment and is therefore inefficient. In this paper, we use a regular fractional factorial design to determine which variables to permute. Based on the results of the trials in the experiment, we calculate the individual importance of the variables with improved precision over the standard method. The method is illustrated with a study of student attrition at Monash University.
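The paper's regular fractional factorial design and its precision analysis are not given in the abstract; the sketch below uses an 8-run Hadamard (Plackett-Burman style) two-level design as a stand-in to decide which of 7 features to permute in each trial, then recovers per-feature importances from the orthogonal main effects. Data set and model settings are placeholders.

```python
# Illustrative only: design-based variable importance for a random forest.
# The Hadamard design stands in for the regular fractional factorial used
# in the paper; data and model are toy choices.
import numpy as np
from scipy.linalg import hadamard
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=7, n_informative=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
base_acc = model.score(X_te, y_te)

design = hadamard(8)[:, 1:]          # 8 runs x 7 factors, entries +1 / -1
drops = []
for run in design:
    X_perm = X_te.copy()
    for j, level in enumerate(run):
        if level == -1:              # "-1" means: permute (break) this feature
            X_perm[:, j] = rng.permutation(X_perm[:, j])
    drops.append(base_acc - model.score(X_perm, y_te))

# orthogonal main-effect estimates, proportional to each feature's importance
drops = np.array(drops)
effects = -(design.T @ drops) / len(design)
print(dict(enumerate(np.round(effects, 4))))
```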

Keywords: Random Forests, Variable Importance, Fractional Factorial Designs, Student Attrition.

1990 Currency Exchange Rate Forecasts Using Quantile Regression

Authors: Yuzhi Cai

Abstract:

In this paper, we discuss a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. Together with a forecast combination technique, we then predict USD to GBP currency exchange rates. Combined forecasts contain all the information captured by the fitted QAR models at different quantile levels and are therefore better than those obtained from individual models. Our results show that an unequally weighted combining method performs better than the other forecasting methods considered. We found that a median AR model can perform well for point forecasting when the predictive density functions are symmetric. In practice, however, using the median AR model alone may lose information about the data captured by the other QAR models. We recommend that combined forecasts be used whenever possible.
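The Bayesian estimation and the unequal combining weights are described in the paper itself; as a deliberately simplified sketch of the core idea, the fragment below fits a QAR(1) at several quantile levels with statsmodels on simulated data and combines the one-step-ahead forecasts with equal weights.

```python
# Simplified sketch of QAR(1) forecasting and forecast combination;
# non-Bayesian, equally weighted, and run on simulated data, unlike
# the approach described in the paper.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
y = np.zeros(n)
for t in range(1, n):                       # toy AR(1) "exchange rate" series
    y[t] = 0.01 + 0.95 * y[t - 1] + 0.01 * rng.standard_normal()

Y = y[1:]                                   # response: y_t
X = sm.add_constant(y[:-1])                 # regressors: [1, y_{t-1}]
quantiles = [0.1, 0.25, 0.5, 0.75, 0.9]

forecasts = []
x_last = np.array([1.0, y[-1]])             # predictors for the one-step-ahead forecast
for q in quantiles:
    res = sm.QuantReg(Y, X).fit(q=q)
    forecasts.append(float(res.params @ x_last))

combined = np.mean(forecasts)               # equal weights here; the paper uses unequal weights
print("per-quantile forecasts:", np.round(forecasts, 4))
print("combined forecast:", round(combined, 4))
```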

Keywords: Exchange rate, quantile regression, combining forecasts.

1989 Determining a Suitable Maintenance Measure for Gentelligent Components Using Case-Based Reasoning

Authors: M. Winkens, P. Nyhuis

Abstract:

Components with sensory properties, such as the gentelligent components developed at the Collaborative Research Centre 653, offer new possibilities for fully utilizing the remaining service life as well as for preventive maintenance. The developed methodology of component-status-driven maintenance analyzes the stress data obtained during the component's useful life and, on the basis of this knowledge, assesses the type of maintenance required in each case. The procedure is derived from the case-based reasoning method and is explained in detail. The method's functionality is demonstrated with real-life data obtained during test runs of a racing car prototype.

Keywords: Gentelligent Components, Preventive Maintenance, Case-based Reasoning.

1988 Modeling Methodologies for Optimization and Decision Support on Coastal Transport Information System (Co.Tr.I.S.)

Authors: Vassilios Moussas, Dimos N. Pantazis, Panagiotis Stratakis

Abstract:

The aim of this paper is to present the optimization methodology developed in the frame of a Coastal Transport Information System. The system will be used for the effective design of coastal transportation lines and incorporates subsystems that implement models, tools and techniques supporting the design of improved networks. The role of the optimization and decision subsystem is to provide the user with better and optimal scenarios that best fulfill any constraints, goals or requirements posed. The complexity of the problem and the large number of parameters and objectives involved led to the adoption of an evolutionary method (Genetic Algorithms). The problem model and the subsystem structure are presented in detail, and the subsystem's support for simulation is also discussed.

Keywords: Coastal transport, modeling, optimization.

1987 Accounting Performance of the Leading Companies in the Construction Sector in Brazil during the Period 2009-2012

Authors: Fabrício José Piacente, Vanessa de Cillos Silva, Thigo Luiz Mello Melato

Abstract:

The construction industry has shown increasing growth and importance in Brazil's national economic development. This study evaluates the financial performance of the leading companies in the construction sector in Brazil in the period from 2009 to 2012. An analysis is made of the capital structure, liquidity and profitability of the six largest companies in the sector: Brookfield, Cyrela, Gafisa, MRV, PDG and Rossi. The results are then compared with standard industry ratios. It was found that, among the companies analyzed, MRV and Cyrela showed the best relative performance in the period under consideration.

Keywords: Accounting ratios, construction, financial performance.

1986 Dynamics of Nutrients Pool in the Baltic Sea Using the Ecosystem Model 3D-CEMBS

Authors: L. Dzierzbicka-Głowacka, M. Janecki

Abstract:

The seasonal variability of nutrient concentrations in the Baltic Sea has been investigated using the 3D ecosystem numerical model 3D-CEMBS. The study also presents the horizontal and vertical distribution of nutrients in the Baltic Sea. The model domain is an extended Baltic Sea area divided into 600 × 640 horizontal grid cells. Aside from standard hydrodynamic parameters, 3D-CEMBS produces modeled ecological variables such as three types of phytoplankton, two detrital classes, dissolved oxygen and the nutrients (nitrate, ammonium, phosphate and silicate). The presented model allows prediction of parameters that describe the distribution of nutrient concentrations and phytoplankton biomass. 3D-CEMBS can be used to study the effect of different hydrodynamic and biogeochemical processes on the distributions of these variables on a larger scale.

Keywords: ecosystem model, nutrients, Baltic Sea

1985 Supercompression for Full-HD and 4k-3D (8k) Digital TV Systems

Authors: Mario Mastriani

Abstract:

In this work, we developed the concept of supercompression, i.e., compression beyond the compression standard used, so that both compression rates are multiplied. Supercompression is based on super-resolution: it is a data compression technique that superposes spatial image compression on top of bit-per-pixel compression to achieve very high compression ratios. If the compression ratio is very high, a convolutive mask inside the decoder restores the edges and eliminates the blur. Finally, both the encoder and the complete decoder are implemented on General-Purpose computation on Graphics Processing Units (GPGPU) cards; specifically, the mentioned mask is coded inside the texture memory of a GPGPU.

Keywords: General-Purpose computation on Graphics Processing Units, Image Compression, Interpolation, Super-resolution.

1984 Modeling of Heat and Mass Transfer in Soil Plant-Atmosphere. Influence of the Spatial Variability of Soil Hydrodynamic

Authors: Aouattou Nabila, Saighi Mohamed, Fekih Malika

Abstract:

The modeling of water transfer in the unsaturated zone uses techniques and methods of soil physics to solve the Richards equation. However, there is a mismatch between the scale of the measurements provided by soil physics and the scale of the fields considered in hydrological modeling, compounded by the strong spatial variability of soil hydraulic properties. The objective of this work was to develop a methodology to estimate the hydrodynamic parameters for modeling water transfers at different hydrological scales in the soil-plant-atmosphere system.

Keywords: Hydraulic properties, Modeling, Unsaturated zone, Transfer, Water

1983 Identification of Nonlinear Systems Structured by Hammerstein-Wiener Model

Authors: A. Brouri, F. Giri, A. Mkhida, F. Z. Chaoui, A. Elkarkri, M. L. Chhibat

Abstract:

Standard Hammerstein-Wiener models consist of a linear subsystem sandwiched between two memoryless nonlinearities. The problem of identifying Hammerstein-Wiener systems is addressed here in the presence of a linear subsystem of totally unknown structure and polynomial input and output nonlinearities. The system nonlinearities are allowed to be noninvertible. The identification problem is dealt with by developing a two-stage frequency identification method. First, the parameters of the system nonlinearities are identified. In the second stage, a frequency approach is designed to estimate the frequency gain of the linear subsystem. All involved estimators are proved to be consistent.

Keywords: Nonlinear system identification, Hammerstein systems, Wiener systems, frequency identification.

1982 Induction Motor Design with Limited Harmonic Currents Using Particle Swarm Optimization

Authors: C. Thanga Raj, S. P. Srivastava, Pramod Agarwal

Abstract:

This paper presents an optimal design of a poly-phase induction motor using Quadratic Interpolation based Particle Swarm Optimization (QI-PSO). The optimization algorithm considers the efficiency, starting torque and temperature rise as objective functions (treated separately) and ten performance-related items, including harmonic current, as constraints. The QI-PSO algorithm was implemented on a test motor and the results are compared with the Simulated Annealing (SA) technique, Standard Particle Swarm Optimization (SPSO) and a normal design. Some benchmark problems are used for validating QI-PSO. The test results show that QI-PSO gave better results and is more suitable for the motor design optimization. Cµ code is used for implementing the entire set of algorithms.
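Neither the quadratic-interpolation operator nor the motor model can be reconstructed from the abstract; the sketch below is only a plain particle swarm optimizer on a toy penalized objective, to show the mechanics that QI-PSO builds on. All constraints and parameters are placeholders.

```python
# Plain PSO on a toy penalized objective; a generic baseline sketch,
# not the QI-PSO or the induction-motor model of the paper.
import numpy as np

def objective(x):
    # toy surrogate: quadratic "loss" plus a penalty for violating
    # an illustrative constraint x0 + x1 >= 1
    loss = (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2
    penalty = 100.0 * max(0.0, 1.0 - (x[0] + x[1])) ** 2
    return loss + penalty

rng = np.random.default_rng(0)
n_particles, n_dims, iters = 30, 2, 200
w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration coefficients

pos = rng.uniform(0.0, 1.0, (n_particles, n_dims))
vel = np.zeros((n_particles, n_dims))
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, n_dims)), rng.random((n_particles, n_dims))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best variables:", np.round(gbest, 4), "objective:", round(float(pbest_val.min()), 6))
```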

Keywords: Design, harmonics, induction motor, particle swarm optimization

1981 Adaptive Total Variation Based on Feature Scale

Authors: Jianbo Hu, Hongbao Wang

Abstract:

The widely used Total Variation de-noising algorithm can preserve sharp edges while removing noise. However, since a fixed regularization parameter is used over the entire image, small details and textures are often lost in the process. In this paper, we propose a modified Total Variation algorithm that better preserves smaller-scale features. This is done by allowing an adaptive regularization parameter to control the amount of de-noising in each region of the image, according to the relative information of the local feature scale. Experimental results demonstrate the efficiency of the proposed algorithm: compared with standard Total Variation, it better preserves smaller-scale features and shows better performance.
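The abstract does not specify how the local feature scale maps to the regularization parameter; the sketch below makes one assumed choice (a larger fidelity weight, hence less smoothing, where the local gradient magnitude is large) inside a simple explicit gradient-descent TV scheme.

```python
# One possible (assumed) realization of spatially adaptive TV de-noising:
# the fidelity weight lam is larger where local gradients are strong, so
# small-scale features are smoothed less. Explicit gradient descent.
import numpy as np

def adaptive_tv_denoise(f, n_iter=200, dt=0.1, eps=1e-3):
    """f: noisy grayscale image as a float array in [0, 1]."""
    gy, gx = np.gradient(f)                       # local feature measure from the input
    feature = np.sqrt(gx**2 + gy**2)
    feature = feature / (feature.max() + eps)
    lam = 0.05 + 1.0 * feature                    # more fidelity (less smoothing) on features

    u = f.copy()
    for _ in range(n_iter):
        uy, ux = np.gradient(u)
        mag = np.sqrt(ux**2 + uy**2) + eps
        # divergence of the normalized gradient field (curvature term)
        div = np.gradient(ux / mag, axis=1) + np.gradient(uy / mag, axis=0)
        u = u + dt * (div - lam * (u - f))        # approximate descent step
    return u

# Usage: denoised = adaptive_tv_denoise(noisy_image)
```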

Keywords: Adaptive, de-noising, feature scale, regularization parameter, Total Variation.

1980 Characterization of Lactose Consumption during the Biogas Production from Acid Whey by FT-IR Spectroscopy

Authors: K. Rugele, M. Gavare, M. Grube, K. Tihomirova, E. Skripsts, S. Larsson, J. Rubulis

Abstract:

The consumption of lactose during the anaerobic fermentation of acid cheese whey under fed-batch conditions was studied. During fermentation for 100 hours, the biogas production (CO2 and CH4) was analyzed online. Alongside the standard analyses, FT-IR spectroscopy was used to follow the consumption of lactose by bacteria. The absorption bands at 990, 894 and 787 cm⁻¹ in the second-derivative spectra were shown to be characteristic of lactose and were used to follow the lactose conversion. It was shown that the acid cheese whey lactose was converted by bacteria in the first 7 hours. In the spectra of the 17, 18 and 95 hour fermentation samples, lactose was not identified, and these results correlated with the HPLC data.

Keywords: Acid whey, anaerobic digestion, biogas, FT-IR spectroscopy, lactose consumption.

1979 Bioclimatic Principles and Urban Open Spaces: The Case of Xanthi

Authors: Maria Giannopoulou

Abstract:

Open urban public spaces are an important element in the development of the social, cultural and economic activities of the population in modern cities. These spaces are also considered regulators of the region's climate conditions, providing better thermal, visual and auditory conditions that can be optimized by applying appropriate bioclimatic design strategies. The paper focuses on the analysis and evaluation, from a bioclimatic perspective, of the recent unification of the open spaces in the centre of Xanthi, a medium-size city in northern Greece, as well as on the creation of a suitable methodology. It is based both on qualitative observation of the interventions through fieldwork research and assessment, and on quantitative analysis and modeling of the research area.

Keywords: Bioclimatic principles, Quantitative analysis, Sustainability, TownScope III, Urban open spaces

1978 A Competitive Replica Placement Methodology for Ad Hoc Networks

Authors: Samee Ullah Khan, C. Ardil

Abstract:

In this paper, a mathematical model for data object replication in ad hoc networks is formulated. The derived model is general, flexible and adaptable to various applications in ad hoc networks. We propose a game-theoretical technique in which players (mobile hosts) continuously compete in a non-cooperative environment to improve data accessibility by replicating data objects. The technique incorporates the access frequency from mobile hosts to each data object, the status of the network connectivity, and communication costs. The proposed technique is extensively evaluated against four well-known ad hoc network replica allocation methods. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.

Keywords: Data replication, auctions, static allocation.

1977 Novel GPU Approach in Predicting the Directional Trend of the S&P 500

Authors: A. J. Regan, F. J. Lidgey, M. Betteridge, P. Georgiou, C. Toumazou, K. Hayatleh, J. R. Dibble

Abstract:

Our goal is the development of an algorithm capable of predicting the directional trend of the Standard and Poor's 500 index (S&P 500). Extensive research has been published attempting to predict different financial markets using historical data, testing on an in-sample and trend basis, with many authors employing excessively complex mathematical techniques. In reviewing and evaluating these in-sample methodologies, it became evident that this approach was unable to achieve sufficiently reliable prediction performance for commercial exploitation. For these reasons, we moved to an out-of-sample strategy based on linear regression analysis of an extensive set of financial data correlated with historical closing prices of the S&P 500. We are pleased to report a directional trend accuracy of greater than 55% for tomorrow (t+1) in predicting the S&P 500.

Keywords: Financial algorithm, GPU, S&P 500, stock market prediction.

1976 Domain-based Key Management Scheme for Active Network

Authors: Jong-Whoi Shin, Soon-Tai Park, Chong-Sun Hwang

Abstract:

Active networks were developed to solve the problems of the current sharing-based network: difficulty in applying new technologies, services or standards, and duplicated operations at several protocol layers. An active network can transport packets loaded with executable code, which makes it possible to change the state of a network node. However, if the network node is placed in a sharing-based network, security and safety issues must be resolved. To satisfy this requirement, various security aspects are needed, such as authentication, authorization, confidentiality and integrity. Among these security components, the core factor is the encryption key. As a result, this study proposes a scheme that manages, based on the domain, the encryption key used to provide security of the comprehensive active directory.

Keywords: Active Network, Domain-based Key Management, Security Components.

1975 Comparison of Finite-Element and IEC Methods for Cable Thermal Analysis under Various Operating Environments

Authors: M. S. Baazzim, M. S. Al-Saud, M. A. El-Kady

Abstract:

In this paper, the steady-state ampacity (current-carrying capacity) of an underground power cable system is evaluated using analytical and numerical methods for different conditions (depth of the cable, spacing between phases, soil thermal resistivity, ambient temperature, wind speed) and for two system voltage levels, 132 and 380 kV. The analytical (traditional) method is based on the thermal analysis approach developed by Neher and McGrath, further enhanced by the International Electrotechnical Commission (IEC) and published in standard IEC 60287. The numerical method is the finite element method, applied through commercial finite-element software.
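For reference, the analytical route cited above rests on the steady-state ampacity equation of IEC 60287-1-1, which has the form sketched below; the numerical values in the example are placeholders, not the cable data used in the paper.

```python
# IEC 60287-1-1 steady-state ampacity (buried cable form); the example
# values are illustrative placeholders only.
from math import sqrt

def iec60287_ampacity(d_theta, R, Wd, T1, T2, T3, T4, n=1, lam1=0.0, lam2=0.0):
    """d_theta: conductor temperature rise above ambient [K]
    R: AC resistance of the conductor at operating temperature [ohm/m]
    Wd: dielectric losses per unit length [W/m]
    T1..T4: thermal resistances (insulation, bedding, serving, surroundings) [K.m/W]
    n: number of load-carrying conductors in the cable
    lam1, lam2: sheath and armour loss factors."""
    num = d_theta - Wd * (0.5 * T1 + n * (T2 + T3 + T4))
    den = R * T1 + n * R * (1 + lam1) * T2 + n * R * (1 + lam1 + lam2) * (T3 + T4)
    return sqrt(num / den)

# Example with placeholder values (single-core cable, dielectric losses neglected):
print(round(iec60287_ampacity(d_theta=60, R=3e-5, Wd=0.0,
                              T1=0.4, T2=0.0, T3=0.05, T4=1.2), 1), "A")
```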

Keywords: Cable ampacity, Finite element method, underground cable, thermal rating.

1974 Anomaly Detection and Characterization to Classify Traffic Anomalies Case Study: TOT Public Company Limited Network

Authors: O. Siriporn, S. Benjawan

Abstract:

This paper presents four unsupervised clustering algorithms, namely sIB, RandomFlatClustering, FarthestFirst, and FilteredClusterer, which previous works have not used for network traffic classification. The methodology, the results, the produced clusters, and an evaluation of the accuracy of each algorithm are shown. The efficiency of the algorithms is also considered in terms of the time needed to generate the clusters quickly and correctly. Our work studies and tests these algorithms by classifying traffic anomalies in network traffic using attributes that have not been used before. We analyze the algorithm with the best efficiency or the best learning and compare it to the previously used K-Means. This research will be used to develop a more efficient anomaly detection system in the future.
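The four algorithms named above are not reproduced here; as a small, hedged illustration of the clustering approach they are compared against, the sketch below clusters synthetic traffic feature vectors with the K-Means baseline and flags records in unusually small clusters as candidate anomalies. The features and the small-cluster rule are assumptions, not the paper's method.

```python
# Hedged illustration of the K-Means baseline mentioned above: cluster
# traffic feature vectors and flag records in unusually small clusters
# as candidate anomalies. Features and thresholds are made up.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
normal = rng.normal(loc=[500, 60, 0.2], scale=[50, 5, 0.05], size=(980, 3))   # bytes, packets, error rate
attack = rng.normal(loc=[5000, 900, 0.9], scale=[300, 50, 0.05], size=(20, 3))
X = StandardScaler().fit_transform(np.vstack([normal, attack]))

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
sizes = np.bincount(labels)
small = np.where(sizes < 0.05 * len(X))[0]          # clusters holding < 5% of records
anomalies = np.isin(labels, small)
print("flagged records:", int(anomalies.sum()))
```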

Keywords: Unsupervised, clustering, anomaly, machine learning.

1973 A Novel Web Metric for the Evaluation of Internet Trends

Authors: Radek Malinský, Ivan Jelínek

Abstract:

Web 2.0 (social networking, blogging and online forums) can serve as a data source for social science research because it contains a vast amount of information from many different users. The volume of that information has been growing at a very high rate and has become a network of heterogeneous data, which makes information difficult to find and therefore of limited use. We have proposed a novel theoretical model for gathering and processing data from Web 2.0 that reflects the semantic content of web pages in a better way. This article deals with the analysis part of the model and its use for the content analysis of blogs. The introductory part of the article describes the methodology for gathering and processing data from blogs. The next part focuses on the evaluation and content analysis of blogs that write about a specific trend.

Keywords: Blog, Sentiment Analysis, Web 2.0, Webometrics

1972 The Study of Synbiotic Dairy Products Rheological Properties during Shelf-Life

Authors: Ilze Beitane, Inga Ciprovica

Abstract:

The influence of lactulose and inulin on the rheological properties of fermented milk during storage was studied. Pasteurized milk, the freeze-dried starter culture Bb-12 (Bifidobacterium lactis, Chr. Hansen, Denmark), inulin (RAFTILINE®HP, ORAFI, Belgium) and lactulose syrup (Duphalac®, the Netherlands) were used for the experiments. The fermentation process was carried out at 37 °C for 16 hours and the products were stored at 4 °C for 7 days. Measurements were carried out by BROOKFIELD standard methods and the flow curves were described by the Herschel-Bulkley model. The results of the dispersion analysis show that both the concentration of prebiotics (p=0.04<0.05) and the shelf life (p=0.003<0.05) have a significant influence on the apparent viscosity of the product.

Keywords: Apparent viscosity, B.lactis, consistency coefficient, flow behavior index, prebiotics.
