Search results for: information model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10361

6461 Criticality Assessment of Failures in Multipoint Communication Networks

Authors: Myriam Noureddine, Rachid Noureddine

Abstract:

Following the current economic challenges and competition, all systems, whatever their field, must be efficient and operational during their activity. In this context, it is imperative to anticipate, identify, eliminate and estimate the failures of systems, which may lead to an interruption of their function. This need requires the management of possible risks, through an assessment of failure criticality following a dependability approach. On the other hand, with the advent of new information technologies and the evolution of the networks field, data transmission has evolved towards multipoint communication, which can simultaneously transmit information from a sender to multiple receivers. This article proposes a criticality assessment of failures in a multipoint communication network, integrating a database of network failures and their quantification. The proposed approach is validated on a case study; the final result is the criticality matrix associated with failures on the considered network, allowing the identification of acceptable risks.
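
As an illustration of the kind of criticality assessment described, here is a minimal sketch assuming standard FMECA-style occurrence and severity indices and an arbitrary acceptability threshold; the scales and failure modes below are placeholders, not values from the study.

```python
# Illustrative sketch of a failure criticality matrix (assumed FMECA-style
# indices and thresholds; the study's actual scales are not given here).
FAILURES = {
    # failure mode: (occurrence index 1-4, severity index 1-4)
    "link_loss":        (3, 4),
    "packet_collision": (4, 2),
    "node_crash":       (2, 4),
    "buffer_overflow":  (3, 1),
}

ACCEPTABILITY_THRESHOLD = 8  # hypothetical limit on criticality = occurrence x severity

def criticality_matrix(failures):
    """Return {failure: (criticality, acceptable?)} for each failure mode."""
    result = {}
    for name, (occurrence, severity) in failures.items():
        crit = occurrence * severity
        result[name] = (crit, crit <= ACCEPTABILITY_THRESHOLD)
    return result

if __name__ == "__main__":
    for name, (crit, ok) in criticality_matrix(FAILURES).items():
        print(f"{name:18s} criticality={crit:2d} {'acceptable' if ok else 'unacceptable'}")
```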

Keywords: Dependability, failure, multipoint network, criticality matrix.

6460 ICCFMS - Enhancing a Competitive Advantage for Thailand’s IT Entrepreneurs

Authors: T. Niracharapa, W. Angkana

Abstract:

Since information and communication technology (ICT) plays a critical role in enhancing national competitiveness, it is a driving force for social and economic growth and prosperity. The ASEAN Economic Community (AEC) will integrate ASEAN countries through new mechanisms and measures intended to improve their economic performance in the global economy. Government policies may support or impede such harmonization. This study investigated and analyzed the status of Thai IT entrepreneurs and defined key strategies to enhance their competitive advantage. Data were collected through in-depth interviews, questionnaires, focus groups, seminars and fieldwork on information technology, excluding communication. SWOT analysis was used as the analytical tool. The results of this study can help the government shape policies, measures and strategies for creating a competitive advantage for Thailand’s IT entrepreneurs in the global market.

Keywords: AEC, ASEAN, competitive advantage, IT entrepreneurs.

6459 Generic Data Warehousing for Consumer Electronics Retail Industry

Authors: S. Habte, K. Ouazzane, P. Patel, S. Patel

Abstract:

The dynamic and highly competitive nature of the consumer electronics retail industry means that businesses in this industry face decision-making challenges relating to pricing, inventory control, consumer satisfaction and product offerings. To overcome the challenges facing retailers and create opportunities, we propose a generic data warehousing solution which can be applied to a wide range of consumer electronics retailers with minimum configuration. The solution includes a dimensional data model, a template SQL script, a high-level architectural description, an ETL tool developed using C#, a set of APIs, and data access tools. It has been successfully applied by ASK Outlets Ltd UK, resulting in improved productivity and enhanced sales growth.

Keywords: Consumer electronics retail, dimensional data model, data analysis, generic data warehousing, reporting.

6458 Image Modeling Using Gibbs-Markov Random Field and Support Vector Machines Algorithm

Authors: Refaat M. Mohamed, Ayman El-Baz, Aly A. Farag

Abstract:

This paper introduces a novel approach to estimating the clique potentials of Gibbs-Markov random field (GMRF) models using the Support Vector Machines (SVM) algorithm and Mean Field (MF) theory. The proposed approach models the potential function associated with each clique shape of the GMRF as a Gaussian-shaped kernel, so that the energy function of the GMRF takes the form of a weighted sum of Gaussian kernels. This formulation allows the energy function to be estimated by SVM learning combined with Mean Field theory. The approach has been tested on synthetic texture images and is shown to provide satisfactory results in retrieving the synthesizing parameters.
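
A generic form of such an energy function, written in assumed notation rather than the paper's exact parameterization, is:

```latex
% Energy of the GMRF as a weighted sum of Gaussian-shaped kernels (generic notation)
E(\mathbf{x}) \;=\; \sum_{c \in \mathcal{C}} \alpha_c
  \exp\!\left(-\frac{\lVert \phi_c(\mathbf{x}) - \mu_c \rVert^2}{2\sigma_c^2}\right),
\qquad
P(\mathbf{x}) \;=\; \frac{1}{Z}\exp\bigl(-E(\mathbf{x})\bigr),
```

where C is the set of clique shapes, φ_c extracts the configuration of clique c, and the weights α_c are what the SVM learning estimates.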

Keywords: Image Modeling, MRF, Parameters Estimation, SVM Learning.

6457 Biosorption of Azo Dye Reactive Black B onto Nonviable Biomass of Cladosporium cladosporioides LM1: Thermodynamic, Kinetic and Equilibrium Modeling

Authors: L. A. S. Dionel, B. A. P. Santos, V. C. P. Lopes, L. G. Vasconcelos, M. A. Soares, E. B. Morais

Abstract:

This study investigated the biosorption of the azo dye Reactive Black B (RBB) from aqueous solution using nonviable biomass of Cladosporium cladosporioides LM1. The biosorption experiments were carried out in batch mode under different conditions of initial pH, contact time, temperature, initial dye concentration and biosorbent dosage. The highest removal rate of RBB was obtained at pH 2. The biosorption data were well described by the pseudo-second-order kinetic model and the Langmuir isotherm model, with a maximum monolayer biosorption capacity estimated at 71.43 mg/g. The values of the thermodynamic parameters ∆G°, ∆H° and ∆S° indicated that the biosorption of RBB onto the fungal biomass was spontaneous and exothermic in nature. It can be concluded that the nonviable biomass of Cladosporium cladosporioides LM1 may be an attractive low-cost biosorbent for the removal of the azo dye RBB from aqueous solution.
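
For reference, the two models named above have the standard forms below; only the fitted monolayer capacity q_max = 71.43 mg/g is reported in the abstract, the remaining symbols being standard notation.

```latex
% Langmuir isotherm and pseudo-second-order kinetic model (standard forms)
q_e \;=\; \frac{q_{\max} K_L C_e}{1 + K_L C_e},
\qquad
\frac{t}{q_t} \;=\; \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e},
```

where C_e is the equilibrium dye concentration, q_e and q_t are the amounts sorbed at equilibrium and at time t, K_L is the Langmuir constant and k_2 the pseudo-second-order rate constant.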

Keywords: Color removal, isotherms and kinetics models, thermodynamic studies, fungus.

6456 Bail-in Capital: The New Box

Authors: Manu Krishnan, Phil Jacoby

Abstract:

In this paper, we discuss the paradigm shift in bank capital from the "gone concern" to the "going concern" mindset. We then propose a methodology for pricing a product of this shift called Contingent Capital Notes ("CoCos"). The Merton Model can determine a price for credit risk by treating the firm's equity value as a call option on its assets. Our pricing methodology for CoCos also uses the credit spread implied by the Merton Model, in a subsequent derivative form created by John Hull et al. Here, a market-implied asset volatility is calculated using observed market CDS spreads. This implied asset volatility is then used to estimate the probability of triggering a predetermined "contingency event" given the distance-to-trigger (DTT). The paper then investigates the effect of varying DTTs and recovery assumptions on the CoCo yield. We conclude with an investment rationale.
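
The underlying Merton relationship treats equity E as a European call on the firm's assets A with the face value of debt D as the strike (textbook form; the authors' extended derivative form is not reproduced here):

```latex
% Merton structural model: equity as a call option on firm assets
E \;=\; A\,N(d_1) - D\,e^{-rT} N(d_2),
\qquad
d_1 \;=\; \frac{\ln(A/D) + \bigl(r + \tfrac{1}{2}\sigma_A^{2}\bigr)T}{\sigma_A\sqrt{T}},
\qquad
d_2 \;=\; d_1 - \sigma_A\sqrt{T},
```

where σ_A is the asset volatility backed out from observed CDS spreads and N(·) is the standard normal distribution function.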

Keywords: CoCo, contingent capital, bank capital, Tier 1 capital.

6455 Random Access in IoT Using Naïve Bayes Classification

Authors: Alhusein Almahjoub, Dongyu Qiu

Abstract:

This paper deals with the random access procedure in next-generation networks and presents a solution to reduce the total service time (TST), one of the most important performance metrics in current and future Internet of Things (IoT) based networks. The proposed solution focuses on the calculation of the optimal transmission probability, which maximizes the success probability and reduces the TST. It uses the number of idle preambles observed in every time slot and, based on it, estimates the number of backlogged IoT devices using Naïve Bayes estimation, a type of supervised learning in the machine learning domain. The estimation of backlogged devices is necessary since the optimal transmission probability depends on it and the eNodeB has no direct information about it. Simulations carried out in MATLAB verify that the proposed solution gives excellent performance.
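
A minimal sketch of this estimation-and-control loop, under assumptions not stated in the abstract: each backlogged device is assumed to pick one of M preambles uniformly, the preambles are treated as independent in naïve-Bayes fashion, and the transmission probability is capped at M over the estimated backlog.

```python
import numpy as np
from scipy.stats import binom

M = 54          # number of contention preambles (typical LTE value; assumption)
N_MAX = 500     # largest backlog considered by the estimator

def idle_probability(n, m=M):
    """P(a given preamble is idle) when n devices each pick one of m preambles."""
    return (1.0 - 1.0 / m) ** n

def estimate_backlog(idle_count, m=M):
    """Naive-Bayes style estimate: treat the m preambles as independent, so the
    idle count is approximately Binomial(m, idle_probability(n)); pick the n
    with the highest likelihood of the observed idle count (flat prior)."""
    candidates = np.arange(1, N_MAX + 1)
    log_lik = binom.logpmf(idle_count, m, idle_probability(candidates, m))
    return int(candidates[np.argmax(log_lik)])

def optimal_transmission_probability(n_est, m=M):
    """Classic slotted-contention rule: allow about m transmission attempts per slot."""
    return min(1.0, m / max(n_est, 1))

if __name__ == "__main__":
    observed_idle = 20                      # idle preambles seen in the last slot
    n_hat = estimate_backlog(observed_idle)
    print(n_hat, optimal_transmission_probability(n_hat))
```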

Keywords: Random access, LTE/LTE-A, 5G, machine learning, Naïve Bayes estimation.

6454 What is the Key Element for the Territory's State of Development?

Authors: J. Lonska, V. Boronenko

Abstract:

The result of the process of a territory's development is the territory's state of development (TSoD), which is aimed at providing and improving people's living conditions. The authors propose measuring the TSoD according to their own model. Using the available statistical data on the values of the model's elements, the authors show empirically which element mainly determines the TSoD. The findings of the research showed that the key elements of the TSoD are "Material welfare of people" and "People's health". A deeper statistical analysis of the correlation between these elements showed that it is not necessary for a country to concentrate on increasing the material growth of a territory, because a relatively high life expectancy at birth can also be ensured by much more modest material resources. On the other hand, the economic feedback of a longer lifespan in countries with lower material performance is also relatively low.

Keywords: Development indices, health, territory's state of development, wealth.

6453 Smart Energy Consumers: An Empirical Investigation on the Intention to Adopt Innovative Consumption Behaviour

Authors: Cecilia Perri, Vincenzo Corvello

Abstract:

The aim of the present study is to investigate consumers' determinants of intention toward the adoption of Smart Grid solutions and technologies. Ajzen's Theory of Planned Behaviour (TPB) model is applied and tested to explain the formation of such adoption intention. An exogenous variable accounting for individuals' resistance to change was added to the basic model. The elicitation study yielded the salient modal beliefs, which were used, with the support of the literature, to design the questionnaire. After the screening phase, data collected from the main survey were analysed to evaluate the measurement model's reliability and validity. Consistent with the theory, the results of structural equation analysis revealed that attitude, subjective norm, and perceived behavioural control positively affected the adoption intention. Specifically, the variable with the highest estimated loading factor was perceived behavioural control, and the most important belief related to each construct was identified (e.g., energy saving was the most significant belief linked with attitude). Further investigation indicated that the added exogenous variable has a negative influence on intention; this finding partially confirmed the hypothesis, since the influence was indirect: the relationship was mediated by attitude. Implications and suggestions for future research are discussed.

Keywords: Adoption of innovation, consumers behaviour, energy management, smart grid, theory of planned behaviour.

6452 Energy Consumption and GHG Production in Railway and Road Passenger Regional Transport

Authors: Martin Kendra, Tomas Skrucany, Jozef Gnap, Jan Ponicky

Abstract:

The paper deals with the modeling and simulation of the energy consumption and GHG production of two modes of regional passenger transport – road and railway. These two transport modes use the same type of fuel, diesel. Modeling and simulation of energy consumption in transport are often used because they offer satisfactory accuracy at reasonable cost. The calculation is based on EN standards, technical information from vehicle producers, and track characteristics. It considers both the maximal theoretical capacity of the bus and the train and real passenger counts measured in operation. The final energy consumption and GHG production are calculated using software simulation and evaluated on a well-to-wheel basis.
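
A back-of-the-envelope sketch of the kind of well-to-wheel comparison involved; every number below is an illustrative assumption, not a value from the study.

```python
# Illustrative well-to-wheel (WTW) energy comparison per passenger-km.
# All inputs are assumed placeholder values, not data from the study.
DIESEL_ENERGY_MJ_PER_L = 36.0   # approximate lower heating value of diesel
WTW_FACTOR = 1.2                # assumed well-to-tank overhead of ~20%

def wtw_energy_per_pkm(fuel_l_per_100km, passengers):
    """Well-to-wheel energy in MJ per passenger-kilometre."""
    ttw = fuel_l_per_100km / 100.0 * DIESEL_ENERGY_MJ_PER_L  # tank-to-wheel MJ/km
    return ttw * WTW_FACTOR / passengers

if __name__ == "__main__":
    bus = wtw_energy_per_pkm(fuel_l_per_100km=30.0, passengers=40)       # assumed regional bus
    train = wtw_energy_per_pkm(fuel_l_per_100km=180.0, passengers=120)   # assumed diesel unit
    print(f"bus:   {bus:.3f} MJ/pkm")
    print(f"train: {train:.3f} MJ/pkm")
```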

Keywords: Bus, energy consumption, GHG production, simulation, train.

6451 Evaluation of Exerting Force on the Heating Surface Due to Bubble Ebullition in Subcooled Flow Boiling

Authors: M. R. Nematollahi

Abstract:

Vibration characteristics of subcooled flow boiling on thin and long structures such as a heating rod were recently investigated by the author. The results show that the intensity of the subcooled boiling-induced vibration (SBIV) was influenced strongly by the subcooling temperature, linear power density and flow velocity. Implosive bubble formation and collapse are the essential nature of subcooled boiling, and their behavior is the sole source of SBIV. Therefore, in order to explain the phenomenon of SBIV, it is essential to obtain reliable information about bubble behavior under subcooled boiling conditions. This was investigated at coolant subcooling temperatures of 25 to 75°C, coolant flow velocities of 0.16 to 0.53 m/s, and linear power densities of 100 to 600 W/cm. High-speed photography at 13,500 frames per second was performed at these conditions. The results show that even at the highest subcooling condition, the absolute majority of bubbles collapse very close to the surface after detaching from the heating surface. Based on these observations, a simple model of surface tension and momentum change is introduced to offer a rough quantitative estimate of the force exerted on the heating surface during bubble ebullition. The formation of a typical bubble in subcooled boiling is predicted to exert an excitation force on the order of 10⁻⁴ N.

Keywords: Subcooled boiling, vibration mechanism, bubble behavior.

6450 Towards an Enhanced Stochastic Simulation Model for Risk Analysis in Highway Construction

Authors: Anshu Manik, William G. Buttlar, Kasthurirangan Gopalakrishnan

Abstract:

Over the years, there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS and the contractor is given considerable freedom in achieving those characteristics. The risk for the contractor or agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful in estimating and analyzing payment risk in ERS systems, and they form an integral part of Illinois' prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids and asphalt content data from ERS projects. The information gained from this is crucial in simulating ERS projects for the estimation and analysis of payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the present ERS implemented in Illinois.

Keywords: Asphalt Pavement, Risk Analysis, Stochastic Simulation, QC/QA.

6449 Mean Velocity Modeling of Open-Channel Flow with Submerged Rigid Vegetation

Authors: M. Morri, A. Soualmia, P. Belleudy

Abstract:

Vegetation affects the mean and turbulent flow structure and may increase flood risks and sediment transport. Therefore, it is important to develop analytical approaches for the bed shear stress on a vegetated bed in order to predict the resistance caused by vegetation. In recent years, experimental and numerical models have been developed to model the effects of submerged vegetation on open-channel flow. In this paper, different analytic models are compared and tested using deviation criteria to explore their capacity for predicting the mean velocity and to select the most suitable one for application to real rivers. The comparison between data measured in a vegetated flume and the simulated mean velocities indicated good performance in the case of rigid vegetation; the Huthoff model shows the best agreement, with a high coefficient of determination (R² = 80%) and the smallest error in the prediction of the average velocities.

Keywords: Analytic Models, Comparison, Mean Velocity, Vegetation.

6448 Parameter Estimation for Viewing Rank Distribution of Video-on-Demand

Authors: Hyoup-Sang Yoon

Abstract:

Video-on-demand (VOD) systems are designed using content delivery networks (CDNs) to minimize the overall operational cost and to maximize scalability. Estimation of the viewing pattern (i.e., the relationship between the number of viewings and the ranking of VOD contents) plays an important role in minimizing the total operational cost and maximizing the performance of VOD systems. In this paper, we have analyzed a large body of commercial VOD viewing data and found that the viewing rank distribution fits the parabolic fractal distribution well. A weighted linear model fitting function is used to estimate the parameters (coefficients) of the parabolic fractal distribution. This paper presents an analytical basis for designing an optimal hierarchical VOD content distribution system in terms of its cost and performance.
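
The parabolic fractal law expresses log-frequency as a quadratic function of log-rank; a minimal weighted fit looks like the sketch below (the synthetic data and the weighting choice are assumptions, not the paper's procedure).

```python
import numpy as np

def fit_parabolic_fractal(views):
    """Fit log10(views) = a + b*log10(rank) + c*log10(rank)^2 (parabolic
    fractal law) with a weighted least-squares polynomial fit.
    `views` must be sorted in descending order; weighting by view count
    emphasises the popular head of the catalogue (an assumed choice)."""
    ranks = np.arange(1, len(views) + 1)
    x, y = np.log10(ranks), np.log10(views)
    c, b, a = np.polyfit(x, y, deg=2, w=np.sqrt(views))
    return a, b, c

if __name__ == "__main__":
    # Synthetic viewing counts for illustration only.
    rng = np.random.default_rng(0)
    true_a, true_b, true_c = 5.0, -0.6, -0.15
    r = np.arange(1, 2001)
    views = 10 ** (true_a + true_b * np.log10(r) + true_c * np.log10(r) ** 2)
    views *= rng.lognormal(0.0, 0.05, size=r.size)
    print(fit_parabolic_fractal(views))
```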

Keywords: VOD, CDN, parabolic fractal distribution, viewing rank, weighted linear model fitting.

6447 Transient Stress Analysis on Medium Module Spur Gear by Using Mode Superposition Technique

Authors: Ali Raad Hassan

Abstract:

The natural frequencies and dynamic response of a spur gear sector are investigated using a two-dimensional finite element model that offers significant advantages for dynamic gear analyses. The gear teeth are analyzed for different operating speeds. A primary feature of this modeling is the determination of mesh forces using a detailed contact analysis for each time step as the gears roll through the mesh. ANSYS software has been used on the proposed model to find the natural frequencies by the Block Lanczos technique and the displacements and dynamic stresses by the transient mode superposition method. The effect of the rotational speed of the gear on the dynamic response of the gear tooth has been studied and design limits have been discussed.
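
For context, the mode superposition method expands the transient response in the mode shapes obtained from the modal (Block Lanczos) solution, reducing the problem to decoupled single-degree-of-freedom equations (standard form, not specific to this model):

```latex
% Mode superposition: response as a combination of mode shapes \phi_i
\mathbf{u}(t) \;=\; \sum_{i=1}^{m} \boldsymbol{\phi}_i\, q_i(t),
\qquad
\ddot{q}_i + 2\zeta_i \omega_i \dot{q}_i + \omega_i^{2} q_i
  \;=\; \boldsymbol{\phi}_i^{\mathsf T}\mathbf{F}(t), \quad i = 1,\dots,m,
```

where ω_i and φ_i are the natural frequencies and mass-normalized mode shapes, ζ_i the modal damping ratios and F(t) the time-varying mesh force.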

Keywords: Spur gear, natural frequency, transient analysis, mode superposition technique.

6446 Robust Regression and its Application in Financial Data Analysis

Authors: Mansoor Momeni, Mahmoud Dehghan Nayeri, Ali Faal Ghayoumi, Hoda Ghorbani

Abstract:

This research aims to describe the application of robust regression and its advantages over the least squares regression method in analyzing financial data. To do this, the relationship between earnings per share, book value of equity per share and share price (the price model), and between earnings per share, the annual change in earnings per share and stock return (the return model), is examined using both robust and least squares regressions, and the outcomes are compared. Comparing the results from the robust regression and the least squares regression shows that the former provides a better and more realistic analysis owing to eliminating or reducing the contribution of outliers and influential data. Therefore, robust regression is recommended for obtaining more precise results in financial data analysis.
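
A minimal sketch of the comparison, using synthetic data with injected outliers; Huber M-estimation stands in for "robust regression" here as one common choice, not necessarily the estimator used in the paper.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
eps = rng.normal(0, 1, n)
eps[:10] += 25                                   # a few gross outliers, as in messy financial data
X = sm.add_constant(rng.normal(0, 1, (n, 2)))    # e.g. EPS and book value per share (placeholders)
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + eps                          # e.g. share price (placeholder)

ols = sm.OLS(y, X).fit()
rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()   # Huber M-estimator

print("OLS coefficients:", np.round(ols.params, 3))
print("RLM coefficients:", np.round(rlm.params, 3))    # less distorted by the outliers
```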

Keywords: Financial data analysis, Influential data, Outliers, Robust regression.

6445 Software Development for the Kinematic Analysis of a Lynx 6 Robot Arm

Authors: Baki Koyuncu, Mehmet Güzel

Abstract:

The kinematics of manipulators is a central problem in the automatic control of robot manipulators. This paper presents the theoretical background for the kinematic analysis of the 5-DOF Lynx-6 educational robot arm. The kinematics problem is defined as the transformation from Cartesian space to joint space and vice versa. The Denavit-Hartenberg (D-H) representation is used to model the robot links and joints in this study. Both forward and inverse kinematics solutions for this educational manipulator are presented, and an effective method is suggested to reduce the number of multiple solutions in inverse kinematics. A visual software package, named MSG, is also developed for testing the motional characteristics of the Lynx-6 robot arm. The kinematics solutions of the software package were found to be identical to the robot arm's physical motional behavior.
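
A minimal forward-kinematics sketch using standard D-H homogeneous transforms; the table of link parameters below is a placeholder, not the actual Lynx-6 values.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_table):
    """Chain the per-link transforms; returns the base-to-end-effector pose."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

if __name__ == "__main__":
    # Placeholder D-H table (d, a, alpha) per joint -- NOT the real Lynx-6 parameters.
    dh_table = [(0.07, 0.0, np.pi / 2), (0.0, 0.12, 0.0),
                (0.0, 0.12, 0.0), (0.0, 0.0, np.pi / 2), (0.13, 0.0, 0.0)]
    q = np.deg2rad([0, 45, -30, 10, 0])
    print(np.round(forward_kinematics(q, dh_table), 3))
```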

Keywords: Lynx 6, robot arm, forward kinematics, inverse kinematics, software, DH parameters, 5 DOF, SSC-32, simulator.

6444 A Metric-Set and Model Suggestion for Better Software Project Cost Estimation

Authors: Murat Ayyıldız, Oya Kalıpsız, Sırma Yavuz

Abstract:

Software project effort estimation is frequently seen as complex and expensive by individual software engineers. Software production suffers from excessive costs and is often out of control; it has been suggested that this is because we do not measure, and you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. Metric-set selection plays a vital role in software cost estimation studies, but its importance has been largely ignored, especially in neural network based studies. In this study we have explored the reasons for those disappointing results and implemented different neural network models using an augmented set of new metrics. The results obtained are compared with previous studies that used traditional metrics. To be able to make comparisons, two types of data have been used. The first part of the data is taken from the Constructive Cost Model (COCOMO'81), which is commonly used in previous studies, and the second part is collected according to the new metrics in a leading international company in Turkey. The accuracy of the selected metrics and the data samples is verified using statistical techniques. The model presented here is based on a Multi-Layer Perceptron (MLP). Another difficulty associated with cost estimation studies is that data collection requires time and care. To make more thorough use of the samples collected, the k-fold cross validation method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied in software cost estimation studies with success.
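
A compact sketch of the modeling setup described above: MLP regression evaluated with k-fold cross validation. The feature matrix here is random placeholder data standing in for COCOMO-style metrics, and the network size is an assumption.

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
X = rng.normal(size=(63, 5))            # placeholder project metrics (e.g. size, complexity)
y = np.exp(0.8 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.2, 63))  # placeholder effort

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)

# k-fold cross validation makes thorough use of a small sample.
scores = cross_val_score(model, X, np.log(y),
                         cv=KFold(n_splits=5, shuffle=True, random_state=0))
print("per-fold R^2:", np.round(scores, 2))
```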

Keywords: Software Metrics, Software Cost Estimation, Neural Network.

6443 Impact of Changes of the Conceptual Framework for Financial Reporting on the Indicators of the Financial Statement

Authors: Nadezhda Kvatashidze

Abstract:

The International Accounting Standards Board (IASB) has updated the conceptual framework for financial reporting. The main reason is to address accounting issues arising from market development and business transactions of new economic content. Investors also call for greater transparency of information and responsibility for results in order to make more accurate risk assessments and forecasts. All this makes it necessary to develop the conceptual framework for financial reporting further so that users receive useful information. Market development and certain shortcomings of the conceptual framework revealed in practice require its reconsideration and new solutions. Some issues and concepts, such as the disclosure and supply of information, its qualitative characteristics, assessment, and measurement uncertainty, had to be supplemented and refined. The criteria for recognition of certain elements of reporting (assets and liabilities) also had to be updated. All of this is set out in the updated edition of the conceptual framework for financial reporting, a comprehensive collection of concepts underlying the preparation of financial statements. The main objective of the revision is to improve financial reporting by developing a clear package of concepts. This will support the IASB in setting a common "Approach & Reflection" for similar transactions on the basis of mutually accepted concepts. As a result, companies will be able to develop coherent accounting policies for transactions or events arising from particular deals for which no standard applies or where a standard allows a choice of accounting policy.

Keywords: Conceptual framework, measurement basis, measurement uncertainty, neutrality, prudence, stewardship.

6442 Adoption and Diffusion of E-Government Services in India: The Impact of User Demographics and Service Quality

Authors: Sayantan Khanra, Rojers P. Joseph

Abstract:

This study attempts to analyze the impact of demography and service quality on the adoption and diffusion of e-Government services in the context of India. The objective of this paper is to study users' perception of e-Government services and investigate the key variables that are most salient to the Indian populace. The study is expected to yield a research model that helps explain the relationship between demographic variables, service quality dimensions, and the willingness to adopt e-Government services. Dedicated authorities, particularly those in developing economies, may use that model or its augmented versions to design and update e-Government services and promote their use among citizens. After all, enhanced public participation is required to improve efficiency, engagement and transparency in the implementation of these services.

Keywords: Adoption and diffusion of e-Government services, demographic variables, hierarchical regression analysis, service quality dimensions.

6441 Deep-Learning Based Approach to Facial Emotion Recognition Through Convolutional Neural Network

Authors: Nouha Khediri, Mohammed Ben Ammar, Monji Kherallah

Abstract:

Recently, facial emotion recognition (FER) has become increasingly important for understanding the state of the human mind. However, accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER that benefits from deep learning, especially CNNs and VGG16. First, the data are pre-processed with data cleaning and data rotation. Then, we augment the data and feed them to our FER model, which contains five convolutional layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. The paper also reviews prior work on facial emotion recognition based on deep learning. Experiments show that our model outperforms the other methods using the same FER2013 database and yields a recognition rate of 92%. We also put forward some suggestions for future work.
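
A minimal Keras sketch of a CNN of the shape described (five convolution + pooling stages ending in a softmax over the seven FER2013 emotion classes); the filter counts, kernel sizes and the 48x48 grayscale input are assumptions, not the authors' exact CV-FER architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_fer_cnn(input_shape=(48, 48, 1), num_classes=7):
    """Five Conv2D + MaxPooling2D stages followed by a softmax classifier."""
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=input_shape))
    for filters in (32, 64, 128, 128, 256):          # assumed filter progression
        model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
        model.add(layers.MaxPooling2D(pool_size=2))
    model.add(layers.Flatten())
    model.add(layers.Dense(128, activation="relu"))
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    build_fer_cnn().summary()
```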

Keywords: CNN, deep-learning, facial emotion recognition, machine learning.

6440 Jitter Transfer in High Speed Data Links

Authors: Tsunwai Gary Yip

Abstract:

Phase locked loops (PLLs) for data links operating at 10 Gb/s or faster are low phase noise devices designed to operate with a low jitter reference clock. Characterization of their jitter transfer function is difficult because the intrinsic noise of the device is comparable to the random noise level in the reference clock signal. A linear model is proposed to account for the intrinsic noise of a PLL. Intrinsic noise data for a PLL for 10 Gb/s links are presented. The jitter transfer function of a PLL in a test chip for 12.8 Gb/s data links was determined in experiments using the 400 MHz reference clock as the source of simultaneous excitations over a wide frequency range. The result shows that the PLL jitter transfer function can be approximated by a second-order linear model.
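
The second-order linear approximation referred to has the classic low-pass form (standard PLL notation; the fitted natural frequency and damping from the test chip are not reproduced here):

```latex
% Classic second-order PLL jitter transfer function
H(s) \;=\; \frac{\theta_{\mathrm{out}}(s)}{\theta_{\mathrm{in}}(s)}
      \;=\; \frac{2\zeta\omega_n s + \omega_n^{2}}{s^{2} + 2\zeta\omega_n s + \omega_n^{2}},
```

with natural frequency ω_n and damping factor ζ.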

Keywords: Intrinsic phase noise, jitter in data link, PLL jitter transfer function, high speed clocking in electronic circuit.

6439 Distortion of Flow Measurement and Cavitation Occurrence Due to Orifice Inlet Velocity Profiles

Authors: Byung-Soo Shin, Nam-Seok Kim, Sang-Kyu Lee, O-Hyun Keum

Abstract:

This analysis investigates the distortion of flow measurement and the increase of cavitation along an orifice flowmeter. A numerical (CFD) analysis, checked for convergence and grid dependency, was used to study the distortion of flow measurement caused by the inlet velocity profile. The realizable k-ε model was selected and y+ was about 50 in this numerical analysis. The analysis also estimated the vulnerability to cavitation due to the inlet velocity profile. The investigation concludes that an inclined inlet velocity profile alters the pressure measured at the pressure tap near the pipe wall, distorting the measured values by -3.8% to 5.3% near the orifice plate and increasing cavitation. The investigation recommends a fully developed inlet velocity profile for accurate flow measurement with an orifice flowmeter.
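
For context, the differential-pressure relation on which orifice metering rests (ISO-5167-style form; symbols are standard and not taken from the paper) shows why a distorted pressure reading Δp translates directly into a flow measurement error:

```latex
% Orifice mass-flow relation (ISO 5167 form)
q_m \;=\; \frac{C}{\sqrt{1-\beta^{4}}}\;\varepsilon\;\frac{\pi}{4}\,d^{2}\,\sqrt{2\,\Delta p\,\rho},
\qquad \beta = d/D,
```

where C is the discharge coefficient, d the orifice bore, D the pipe diameter, ε the expansibility factor, ρ the fluid density and Δp the measured differential pressure.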

Keywords: Orifice, k-ε model, CFD.

6438 Estimation Model of Dry Docking Duration Using Data Mining

Authors: Isti Surjandari, Riara Novita

Abstract:

Maintenance is one of the most important activities in the shipyard industry. However, it is sometimes not supported by adequate services from the shipyard, and inaccuracy in estimating the duration of ship maintenance is still common. This makes accurate estimation of ship maintenance duration crucial. This study uses a data mining approach, i.e., CART (Classification and Regression Trees), to estimate the duration of ship maintenance limited to dock works, known as dry docking. Using the volume of dock works as an input to estimate the maintenance duration, four classes of dry docking duration were obtained, each with a different linear model and job criteria. These linear models can then be used to estimate the duration of dry docking based on the job criteria.
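
A small sketch of this two-stage idea: a shallow regression tree splits jobs into four duration classes by work volume, then a linear model is fitted per class. The synthetic data and the four-leaf choice mirror the abstract; everything else is an assumption.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
work_volume = rng.uniform(10, 500, size=(300, 1))                 # placeholder dock-work volume
duration = 5 + 0.08 * work_volume[:, 0] + rng.normal(0, 3, 300)   # placeholder days in dock

# Stage 1: CART with four leaves -> four dry-docking duration classes.
tree = DecisionTreeRegressor(max_leaf_nodes=4, random_state=0).fit(work_volume, duration)
classes = tree.apply(work_volume)                                 # leaf id = duration class

# Stage 2: a separate linear model per class.
models = {}
for leaf in np.unique(classes):
    mask = classes == leaf
    models[leaf] = LinearRegression().fit(work_volume[mask], duration[mask])

new_job = np.array([[220.0]])
leaf = tree.apply(new_job)[0]
print(f"class {leaf}: estimated duration {models[leaf].predict(new_job)[0]:.1f} days")
```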

Keywords: Classification and regression tree (CART), data mining, dry docking, maintenance duration.

6437 Investigation of the Cooling and Uniformity Effectiveness in a Sinter Packed Bed

Authors: Uzu-Kuei Hsu, Chang-Hsien Tai, Kai-Wun Jin

Abstract:

When sinters are discharged from the sintering machine into the cooler, their non-uniform distribution leads to uneven cooling. This causes the temperature difference of the sinters leaving the cooler to be so large that the conveyors are deformed by the heat. The present work applies a CFD method with a porous media model to investigate the thermo-flowfield phenomena in a sinter cooler. Experimental data were used to determine the porosity (Ε), permeability (κ), inertial coefficient (F), specific heat (Cp) and effective thermal conductivity (keff) of the sinter packed beds. The physical model is a geometrically similar configuration whose Darcy numbers (Da) are similar to those of the sinter cooler. A Cooling Index (CI) and a Uniformity Index (UI) are used to analyze the thermo-flowfield in the sinter packed bed and to obtain the cooling performance of the sinter cooler.
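
In a porous media model the packed bed enters the momentum equations as a sink combining a viscous (Darcy) and an inertial (Forchheimer) term, which is where the permeability κ and inertial coefficient F above appear (standard form; the exact formulation used in the study may differ):

```latex
% Darcy--Forchheimer momentum sink for the packed bed
S_i \;=\; -\left(\frac{\mu}{\kappa}\,v_i \;+\; F\,\frac{\rho}{2}\,\lvert \mathbf{v} \rvert\, v_i\right),
```

with fluid viscosity μ, density ρ and superficial velocity v.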

Keywords: Porous media, sinter, cooling index, uniformity index, CFD.

6436 A Linearization and Decomposition Based Approach to Minimize the Non-Productive Time in Transfer Lines

Authors: Hany Osman, M. F. Baki

Abstract:

We address the balancing problem of transfer lines in this paper to find the optimal line balancing that minimizes the non-productive time. We focus on the tool change time and face orientation change time, both of which influence the makespan. We consider machine capacity limitations and the technological constraints associated with the manufacturing process of auto cylinder heads. The problem is represented by a mixed integer programming model that distributes the design features to workstations and sequences the machining processes at minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and a Benders decomposition approach. Experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.

Keywords: Transfer line balancing, Benders' decomposition, Linearization.

6435 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups

Authors: Naushad Mamode Khan

Abstract:

The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs), as it handles equi-, over- and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify, which restricts likelihood-based estimation. The joint generalized quasi-likelihood approach (GQL-I) was considered instead, but it is computationally intensive and may fail to estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments and is shown to yield estimates as efficient as GQL-I while being far more computationally stable.
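
For reference, the Com-Poisson probability mass function underlying the model is (standard parameterization; ν > 1 gives under-dispersion, ν < 1 over-dispersion, and ν = 1 recovers the Poisson):

```latex
% Conway--Maxwell--Poisson probability mass function
P(Y = y) \;=\; \frac{\lambda^{y}}{(y!)^{\nu}\, Z(\lambda,\nu)},
\qquad
Z(\lambda,\nu) \;=\; \sum_{j=0}^{\infty}\frac{\lambda^{j}}{(j!)^{\nu}},
\qquad y = 0,1,2,\dots
```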

Keywords: Longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL.

6434 A Robust Image Steganography Method Using PMM in Bit Plane Domain

Authors: Souvik Bhattacharyya, Aparajita Khan, Indradip Banerjee, Gautam Sanyal

Abstract:

Steganography is the art and science of hiding information in an appropriate cover carrier such as image, text, audio or video media. In this work the authors propose a new image-based steganographic method for hiding information within the complex bit planes of the image. After slicing into bit planes, the cover image is analyzed to rank the planes in decreasing order of bit plane complexity. A complexity function then determines the complex, noisy blocks of the chosen bit plane, and finally the pixel mapping method (PMM) is used to embed secret bits into those regions of the bit plane. The novel approach of using the pixel mapping method (PMM) in the bit plane domain adaptively embeds data in the most complex regions of the image, providing high embedding capacity, better imperceptibility and resistance to steganalysis attacks.
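
A minimal sketch of the bit-plane slicing step and a simple block-complexity measure; the complexity measure here is a generic bit-transition count, an assumption rather than the paper's exact function.

```python
import numpy as np

def bit_planes(gray_image):
    """Slice an 8-bit grayscale image into its 8 binary bit planes (LSB first)."""
    return [((gray_image >> k) & 1).astype(np.uint8) for k in range(8)]

def block_complexity(plane_block):
    """Generic complexity measure: fraction of 0/1 transitions along rows and columns."""
    h_changes = np.sum(plane_block[:, 1:] != plane_block[:, :-1])
    v_changes = np.sum(plane_block[1:, :] != plane_block[:-1, :])
    rows, cols = plane_block.shape
    max_changes = rows * (cols - 1) + cols * (rows - 1)
    return (h_changes + v_changes) / max_changes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    planes = bit_planes(cover)
    # Rank planes by mean complexity of their 8x8 blocks (most complex first).
    scores = [np.mean([block_complexity(p[i:i + 8, j:j + 8])
                       for i in range(0, 64, 8) for j in range(0, 64, 8)])
              for p in planes]
    print(np.argsort(scores)[::-1])   # indices of planes, most complex first
```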

Keywords: PMM (Pixel Mapping Method), Bit Plane, Steganography, SSIM, KL-Divergence.

6433 Evaluation of State of the Art IDS Message Exchange Protocols

Authors: Robert Koch, Mario Golling, Gabi Dreo

Abstract:

During the last couple of years, the degree of dependence on IT systems has reached a dimension nobody imagined possible 10 years ago. The increased usage of mobile devices (e.g., smartphones), wireless sensor networks and embedded devices (the Internet of Things) are only some examples of the dependency of modern societies on cyberspace. At the same time, the complexity of IT applications, e.g., because of the increasing use of cloud computing, is rising continuously. Along with this, the threats to IT security have increased both quantitatively and qualitatively, as recent examples like STUXNET or the suspected cyber attack on the Illinois water system demonstrate impressively. Once-isolated control systems are nowadays often publicly reachable - a fact that was never intended by their developers. Threats to IT systems do not respect areas of responsibility. Especially with regard to cyber warfare, IT threats are no longer limited to company or industry boundaries, administrative jurisdictions or state borders. One of the important countermeasures is increased cooperation among the participants, especially in the field of cyber defence. Besides political and legal challenges, there are technical ones as well. A better, at least partially automated exchange of information is essential to (i) enable sophisticated situational awareness and to (ii) counter the attacker in a coordinated way. Therefore, this publication evaluates state of the art Intrusion Detection Message Exchange protocols with a view to guaranteeing a secure information exchange between different entities.

Keywords: Cyber Defence, Cyber Warfare, Intrusion Detection Information Exchange, Early Warning Systems, Joint Intrusion Detection, Cyber Conflict.

6432 An Approach to Task Modeling for User Interface Design

Authors: Costin Pribeanu

Abstract:

The model-based approach to user interface design relies on developing separate models capturing various aspects of users, tasks, the application domain, and presentation and dialog structures. This paper presents a task modeling approach for user interface design and aims at exploring mappings between task, domain and presentation models. The basic idea of our approach is to identify typical configurations in task and domain models and to investigate how they relate to each other. Special emphasis is put on application-specific functions and on mappings between domain objects and operational task structures. In this respect, we address two layers in task decomposition: a functional (planning) layer and an operational layer.

Keywords: task modeling, user interface design, unit tasks, basic tasks, operational task model.
