Search results for: traditional approach
4178 An Improved Data Mining Method Applied to the Search of Relationship between Metabolic Syndrome and Lifestyles
Authors: Yi Chao Huang, Yu Ling Liao, Chiu Shuang Lin
Abstract:
A data cutting and sorting method (DCSM) is proposed to optimize the performance of data mining. DCSM reduces calculation time by discarding redundant data during the mining process, and it minimizes the number of computational units by splitting the database and sorting data by support counts. In searching for the relationship between metabolic syndrome and lifestyles in the health examination database of an electronics manufacturing company, DCSM demonstrates higher search efficiency than the traditional Apriori algorithm in tests with different support counts.
Keywords: Data mining, data cutting and sorting method, Apriori algorithm, metabolic syndrome.
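For illustration, here is a minimal sketch of the support-counting and pruning step that both DCSM and the Apriori algorithm rely on, applied to transaction-style lifestyle records; the record contents and the minimum support count are invented, not taken from the paper.

```python
from itertools import combinations
from collections import Counter

# Illustrative lifestyle "transactions"; real records would come from the
# health examination database described in the abstract.
transactions = [
    {"smoking", "low_exercise", "metabolic_syndrome"},
    {"low_exercise", "high_sugar_diet", "metabolic_syndrome"},
    {"smoking", "high_sugar_diet"},
    {"low_exercise", "metabolic_syndrome"},
]
min_support = 2  # assumed minimum support count

# Count the support of candidate 2-itemsets, then prune those below the threshold.
counts = Counter()
for t in transactions:
    for pair in combinations(sorted(t), 2):
        counts[pair] += 1
frequent_pairs = {p: c for p, c in counts.items() if c >= min_support}
print(frequent_pairs)
```

Pruning infrequent candidates early is where discarding redundant data, as DCSM does, saves computation.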
4177 Scaling up Detection Rates and Reducing False Positives in Intrusion Detection using NBTree
Authors: Dewan Md. Farid, Nguyen Huu Hoa, Jerome Darmont, Nouria Harbi, Mohammad Zahidur Rahman
Abstract:
In this paper, we present a new learning algorithm for anomaly-based network intrusion detection using an improved self-adaptive naïve Bayesian tree (NBTree), which induces a hybrid of a decision tree and a naïve Bayesian classifier. The proposed approach scales up balanced detection across different attack types and keeps false positives at an acceptable level. On large, complex, and dynamic intrusion detection datasets, the detection accuracy of the naïve Bayesian classifier does not scale as well as that of the decision tree, and the naïve Bayesian tree has been shown in other problem domains to improve classification rates on large datasets. In a naïve Bayesian tree, internal nodes split as in a regular decision tree, while the leaves contain naïve Bayesian classifiers. Experimental results on the KDD99 benchmark network intrusion detection dataset demonstrate that the new approach scales up detection rates for different attack types and reduces false positives in network intrusion detection.
Keywords: Detection rates, false positives, network intrusion detection, naïve Bayesian tree.
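The hybrid structure described above, a decision tree whose leaves hold naïve Bayesian classifiers, can be sketched with off-the-shelf components. This is an illustrative reconstruction on synthetic data, not the authors' self-adaptive NBTree or the KDD99 setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for an intrusion detection dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
train_leaves = tree.apply(X)                       # leaf index of every training sample

# Fit one naive Bayesian classifier on the samples routed to each leaf.
leaf_models = {leaf: GaussianNB().fit(X[train_leaves == leaf], y[train_leaves == leaf])
               for leaf in np.unique(train_leaves)}

def predict(X_new):
    """Route each sample down the tree, then classify it with its leaf's naive Bayes model."""
    leaves = tree.apply(X_new)
    return np.array([leaf_models[leaf].predict(row.reshape(1, -1))[0]
                     for leaf, row in zip(leaves, X_new)])

print("training accuracy:", (predict(X) == y).mean())
```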
4176 A 3D Approach for Extraction of the Coronary Artery and Quantification of the Stenosis
Authors: Mahdi Mazinani, S. D. Qanadli, Rahil Hosseini, Tim Ellis, Jamshid Dehmeshki
Abstract:
Segmentation and quantification of stenosis is an important task in assessing coronary artery disease. One of the main challenges is measuring the real diameter of curved vessels. Moreover, uncertainty in segmenting the different tissues in the narrow vessel is an important issue that affects accuracy. This paper proposes an algorithm to extract coronary arteries and measure the degree of stenosis. A Markovian fuzzy clustering method is applied to model the uncertainty arising from the partial volume effect. The algorithm comprises segmentation, centreline extraction, estimation of the plane orthogonal to the centreline, and measurement of the degree of stenosis. To evaluate accuracy and reproducibility, the approach has been applied to a vascular phantom and the results compared with the real diameters. The results on 10 patient datasets have been visually judged by a qualified radiologist. The results reveal the superiority of the proposed method over the conventional thresholding method (CTM) on both datasets.
Keywords: 3D coronary artery tree extraction, segmentation, quantification, fuzzy clustering, Markov random field.
4175 International Migration of Highly Skilled Indian Professionals: A Case Study of Indian IT Professionals in Japan, Preliminary Results
Authors: Rimpi Rani
Abstract:
In the 2000s, a new migration trend of highly skilled Indian professionals toward Japan appeared. This paper examines the factors that triggered the inflow of highly skilled Indian professionals into Japan, focusing mainly on IT professionals' immigration, and the reasons for the increase in their numbers. It investigates the influence of four factors: Japanese immigration policy, the bilateral relations between India and Japan, the higher education system in India, and the American H-1B visa policy with its cap system. The study concludes that the increased and continuous supply of highly skilled Indian professionals has intensified the competition for migration to traditional destinations such as the USA, which has led Indian professionals to consider other options such as Japan.
Keywords: International migration, India, Japan, highly skilled professionals.
4174 Adequacy of Object-Oriented Framework System-Based Testing Techniques
Authors: Jehad Al Dallal
Abstract:
An application framework provides a reusable design and implementation for a family of software systems. If the framework contains defects, the defects are passed on to the applications developed from it. Framework defects are hard to discover at the time the framework is instantiated, so it is important to remove all defects before instantiation. In this paper, two measures for the adequacy of an object-oriented system-based testing technique are introduced. The measures assess the usefulness and uniqueness of the testing technique. The two measures are applied to experimentally compare the adequacy of two techniques introduced to test object-oriented frameworks at the system level: the New Framework Test Approach and Testing Frameworks Through Hooks (TFTH). The techniques are also compared analytically in terms of their coverage power of object-oriented aspects. The comparison results show that the TFTH technique is better than the New Framework Test Approach in terms of usefulness degree, uniqueness degree, and coverage power.
Keywords: Object-oriented framework, object-oriented framework testing, test case generation, testing adequacy.
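The abstract does not give formulas for the two adequacy measures; the sketch below shows one plausible reading, in which usefulness is the share of seeded defects a technique reveals and uniqueness is the share it alone reveals. The defect identifiers and counts are hypothetical.

```python
# Hypothetical defect IDs seeded into a framework, and the sets each technique detected.
seeded = set(range(1, 21))
detected_tfth = {1, 2, 3, 5, 7, 8, 11, 12, 13, 15, 17, 19}
detected_new  = {1, 2, 4, 5, 7, 8, 11, 13}

def usefulness(detected, seeded):
    """Share of seeded defects the technique reveals at all (assumed definition)."""
    return len(detected & seeded) / len(seeded)

def uniqueness(detected, other, seeded):
    """Share of seeded defects only this technique reveals (assumed definition)."""
    return len((detected - other) & seeded) / len(seeded)

print("TFTH:", usefulness(detected_tfth, seeded), uniqueness(detected_tfth, detected_new, seeded))
print("New :", usefulness(detected_new, seeded), uniqueness(detected_new, detected_tfth, seeded))
```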
4173 The Effect of Sodium Chloride and pH on the Antimicrobial Effectiveness of Essential Oils Against Pathogenic and Food Spoilage Bacteria: Implications in Food Safety
Authors: P. O. Angienda, D. J. Hill
Abstract:
The purpose of this study was to elucidate the factors affecting the antimicrobial effectiveness of essential oils against food spoilage and pathogenic bacteria. The minimum inhibitory concentrations (MICs) of the essential oils were determined by a turbidimetric technique using a Bioscreen C analyzer. The effects of pH ranging from 7.3 to 5.5, in the absence and presence of essential oils and/or NaCl, on the lag time and mean generation time of the bacteria at 37°C were examined; the results showed that the combination of low pH and essential oil at 37°C had additive effects against the test micro-organisms. The combination of 1.2% (w/v) NaCl and clove essential oil at 0.0325% (v/v) was effective against E. coli. The use of concentrations below the MIC in combination with low pH and/or NaCl has the potential to serve as an alternative to traditional food preservatives.
Keywords: Antimicrobial, Bacteria, Bioscreen C, essential oil.
4172 The Regional Concept, Public Policy and Policy Spaces: The ARC and TVA
Authors: Jay D. Gatrell, Robert Q. Hanham, Jeff Worsham, Maureen McDorman
Abstract:
This paper examines two policy spaces, the ARC and TVA, and their spatialized politics. The research observes that the regional concept informs public policy and can contribute to the formation of stable policy initiatives. Using the subsystem framework to understand the political viability of policy regimes, the authors conclude that policy geographies appealing to traditional definitions of regions are more stable over time. In contrast, geographies that fail to reflect pre-existing representations of space are engaged in more competitive subsystem politics. The paper demonstrates that the spatial practices of policy regions and their directional politics influence the political viability of programs. The paper concludes that policy spaces should institutionalize pre-existing geographies rather than manufacture new ones.
Keywords: Agenda setting, politics, region.
4171 Application Aspects of Public Relations by Nonprofit Organizations: Case Study Albania
Authors: Xhiliola Agaraj (Shehu), Merita Murati, Valbona Gjini
Abstract:
The traditional public relations manager is usually responsible for maintaining and enhancing the reputation of the organization among key publics. While the principal focus of this effort is on support publics, it is clearly recognized that an organization's image has important effects on its own employees, its donors and volunteers, and its clients. The aim of this paper is to describe how public relations media and tools are applied by nonprofit organizations in the Albanian context. Do nonprofit organizations actually use public relations media and tools, such as written material, audiovisual material, organizational identity media, news, interviews and speeches, events, and web sites, to attract donors? And if such media and tools are used, does a relation exist between public relations media and fundraising?
Keywords: Donors, Fundraising, Nonprofit Organizations, Public Relations
4170 ELD79-LGD2006 Transformation Techniques Implementation and Accuracy Comparison in Tripoli Area, Libya
Authors: Jamal A. Gledan, Othman A. Azzeidani
Abstract:
During the last decade, Libya established a new geodetic datum, the Libyan Geodetic Datum 2006 (LGD2006), using GPS, whereas the previous Libyan datum, the European Libyan Datum 79 (ELD79), was established by ground traversing. This paper introduces ELD79-to-LGD2006 coordinate transformation techniques and compares the accuracy of transformation by multiple regression equations with that of the three-parameter (Bursa-Wolf) model. The results obtained show that the overall accuracy of the stepwise multiple regression equations is better than that achieved with the Bursa-Wolf transformation model.
Keywords: Geodetic datum, horizontal control points, traditional similarity transformation model, unconventional transformation techniques.
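A sketch of the three-parameter (translation-only) transformation named in the abstract: the shifts are estimated by least squares from common points and then applied to new coordinates. The coordinate values and noise level below are invented for illustration, not taken from the ELD79/LGD2006 control points.

```python
import numpy as np

# Geocentric (X, Y, Z) coordinates of common control points in both datums
# (illustrative values only, in metres).
eld79 = np.array([[5023146.2, 1123456.7, 3567890.1],
                  [5023001.9, 1123702.3, 3568011.4],
                  [5022830.5, 1123950.8, 3568204.9]])
true_shift = np.array([-87.3, 12.1, 45.6])                       # assumed "true" translation
lgd2006 = eld79 + true_shift + np.random.default_rng(0).normal(0, 0.05, eld79.shape)

# Least-squares estimate of the three translation parameters dX, dY, dZ.
shift = (lgd2006 - eld79).mean(axis=0)

def eld79_to_lgd2006(xyz):
    """Apply the estimated three-parameter (translation-only) transformation."""
    return np.asarray(xyz) + shift

residuals = lgd2006 - eld79_to_lgd2006(eld79)
print("estimated shift:", shift.round(3), "RMS residual (m):", float(np.sqrt((residuals**2).mean())))
```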
4169 DC Link Floating for Grid Connected PV Converters
Authors: Attila Balogh, Eszter Varga, István Varjasi
Abstract:
Nowadays there are many grid-connected converters in the power grid, generally the converters of renewable energy sources, industrial four-quadrant drives, and other converters with a DC link. These converters are connected to the grid through a three-phase bridge. Standards prescribe the maximum harmonic emission, which can easily be limited with a high switching frequency. The increased switching losses can be halved with the well-known flat-top modulation. The suggested control method is an extension of flat-top modulation with which the losses can be halved again compared to flat-top modulation. Compared to traditional control, these requirements can be satisfied simultaneously much better with the DC link floating (DLF) method.
Keywords: DC link floating, high efficiency, PV converter, control method.
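For orientation, a small sketch of the standard flat-top (discontinuous) modulation idea that DLF extends: a common-mode offset clamps the largest phase reference to the positive rail so that leg stops switching. The frequency and modulation index are assumptions, and this is ordinary flat-top modulation, not the paper's DLF control.

```python
import numpy as np

f, m = 50.0, 0.9                               # fundamental frequency (Hz) and modulation index, assumed
t = np.linspace(0, 1 / f, 1000, endpoint=False)
theta = 2 * np.pi * f * t
refs = m * np.vstack([np.sin(theta),
                      np.sin(theta - 2 * np.pi / 3),
                      np.sin(theta + 2 * np.pi / 3)])

# Flat-top modulation: inject a common-mode offset so the largest phase reference
# is clamped to the positive rail (+1); that leg does not switch during the
# clamped interval, which is where the switching-loss reduction comes from.
offset = 1.0 - refs.max(axis=0)
flat_top = refs + offset

print("max reference:", flat_top.max(), "min reference:", round(flat_top.min(), 3))
```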
4168 Complex Network Approach to International Trade of Fossil Fuel
Authors: Semanur Soyyiğit Kaya, Ercan Eren
Abstract:
Energy has a prominent role in the development of nations. Countries with energy resources also have strategic power in the international trade of energy, since energy is essential for all stages of production in the economy. It is therefore important for countries to analyze the strengths and weaknesses of the system. International trade, in turn, is one of the fields that can be analyzed as a complex network. Complex network analysis is a tool for studying systems of heterogeneous agents and the interactions between them: a complex network consists of nodes and the interactions between these nodes, and the properties that emerge from these interactions are distinct from the sum of the parts. Standard approaches to international trade are therefore too superficial to analyze such systems, whereas network analysis treats international trade as a network in which countries constitute nodes and trade relations (exports or imports) constitute edges. It then becomes possible to analyze the international trade network in terms of indicators specific to complex networks, such as connectivity, clustering, assortativity/disassortativity, and centrality. Here, the international trade of crude oil and coal, two types of fossil fuel, is analyzed from 2005 to 2014 via network analysis. First, the networks are characterized by topological parameters such as density, transitivity, and clustering. Then, fit to a Pareto distribution is assessed via the Kolmogorov-Smirnov test. Finally, the weighted HITS algorithm is applied to the data as a centrality measure to determine the real prominence of countries in these trade networks. Weighted HITS is a strong tool for analyzing the network because it ranks countries with regard to the prominence of their trade partners; applying it to the data yields both an export centrality and an import centrality. As a result, the impacts of the trading countries are presented in terms of these indicators.
Keywords: Complex network approach, fossil fuel, international trade, network theory.
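A compact power-iteration version of the weighted HITS centrality mentioned above, on a toy weighted trade matrix: hub scores play the role of export centrality and authority scores the role of import centrality. The countries and trade values are made up, not real trade data.

```python
import numpy as np

# Toy weighted trade matrix: W[i, j] = export value from country i to country j.
countries = ["A", "B", "C", "D"]
W = np.array([[0, 5, 2, 0],
              [1, 0, 4, 3],
              [0, 2, 0, 6],
              [2, 0, 1, 0]], dtype=float)

hubs = np.ones(len(countries))
auths = np.ones(len(countries))
for _ in range(100):                 # power iteration of weighted HITS
    auths = W.T @ hubs               # a country is a good importer if good exporters sell to it
    hubs = W @ auths                 # a country is a good exporter if it sells to good importers
    auths /= np.linalg.norm(auths)
    hubs /= np.linalg.norm(hubs)

print("export centrality (hubs):", dict(zip(countries, hubs.round(3))))
print("import centrality (auths):", dict(zip(countries, auths.round(3))))
```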
4167 Dependability Tools in Multi-Agent Support for Failures Analysis of Computer Networks
Authors: Myriam Noureddine
Abstract:
During operation, all systems must remain functional without failures; in this context, the concept of dependability is essential to avoid disruption of their function. As computer networks are systems with the same dependability requirements, this article deals with the failure analysis of a computer network. The proposed approach integrates specific tools of the KB3 platform, usually applied in dependability studies of industrial systems. The methodology is supported by a multi-agent system formed by six agents grouped into three meta-agents, operating on two levels. The first level concerns a modeling step carried out by a conceptual agent and a generating agent. The conceptual agent builds the knowledge base from the system specifications written in the FIGARO language, while the generating agent automatically produces both the structural model and a dependability model of the system. The second level, simulation, shows the effects of failures on the system through a simulation agent. The approach is validated by applying it to a specific computer network, yielding an analysis of failures through their effects on the considered network.
Keywords: Computer network, dependability, KB3 platform, multi-agent system, failure.
4166 Factors Influencing B2c eCommerce Diffusion
Authors: R. Mangiaracina, A. Perego, F. Campari
Abstract:
Despite the fact that B2c eCommerce has become important in numerous economies, its adoption varies from country to country. This paper aims to identify the factors affecting (enabling or inhibiting) B2c eCommerce and to determine their quantitative impact on the diffusion of online sales across countries. A dynamic panel model has been developed to analyze the relationship between 13 factors (macroeconomic, demographic, socio-cultural, infrastructural, and offer-related), identified through a complete literature analysis, and the B2c eCommerce value in 45 countries over 9 years. GDP, mobile penetration, Internet user penetration, and credit card penetration, all with positive coefficients, emerged as enabling drivers of B2c eCommerce value across countries, whereas equal distribution of income and the development of the traditional retailing network, both with negative coefficients, act as inhibiting factors.
Keywords: B2c eCommerce diffusion, influencing factors, dynamic panel model.
4165 The Rise of Nationalism among South Korean Youth and Democracy: An Analysis
Authors: Noor Sulastry Yurni Ahmad, Ki-Soo Eun
Abstract:
The 2008 candlelight protests in Korea were very significant in portraying the political environment among South Korean youth. New challenges and advanced technologies have driven young people to engage in the political arena, shifting them from a traditional Korean youth community toward a much broader one. Owing to the historical relationship with the people of North Korea, the younger generation has embraced a different view of ethnic nationalism. This study examines youth involvement in politics together with their level of acceptance of democratic practice. The survey results show that increased use of new media, which youth use as a platform to obtain political information, has raised their sociopolitical interest. Furthermore, the rise of nationalism and patriotism is discussed in relation to the dynamics of the political approaches used by the Korean government.
Keywords: Nationalism, new media, political participation, youth.
4164 A Practical Approach for Electricity Load Forecasting
Authors: T. Rashid, T. Kechadi
Abstract:
This paper continues our daily peak energy load forecasting approach using a modified network from the recurrent network family, called the feed-forward and feedback multi-context artificial neural network (FFFB-MCANN). The inputs to the network were exogenous variables, such as the previous and current change in the weather components and the previous and current status of the day, together with endogenous variables such as past changes in the load. An endogenous variable, the current change in the load, was used as the network output. Experiments show that using both endogenous and exogenous variables as inputs to the FFFB-MCANN produces better results than using either type alone. They also show that using changes in variables such as the weather components and the past load as inputs, rather than their absolute values, has a dramatic impact and produces better accuracy.
Keywords: Daily peak load forecasting, feed forward and feedback multi-context neural network.
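A minimal sketch of the idea of forecasting with changes rather than absolute values: lagged differences of load and a weather component feed a small neural network that predicts the next change in peak load. The synthetic series and the off-the-shelf MLP are stand-ins; the FFFB-MCANN itself is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
days = 400
temp = 20 + 8 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 1, days)
load = 100 + 2.5 * temp + rng.normal(0, 2, days)       # synthetic daily peak load

d_temp, d_load = np.diff(temp), np.diff(load)           # changes, not absolute values
X = np.column_stack([d_temp[1:],                        # current change in weather
                     d_temp[:-1],                       # previous change in weather
                     d_load[:-1]])                      # previous change in load
y = d_load[1:]                                          # target: current change in peak load

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X[:300], y[:300])
print("held-out R^2 on load changes:", round(model.score(X[300:], y[300:]), 3))
```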
4163 A Complexity Measure for JavaBean-Based Software Components
Authors: Sandeep Khimta, Parvinder S. Sandhu, Amanpreet Singh Brar
Abstract:
Traditional software product and process metrics are neither suitable nor sufficient for measuring the complexity of software components, which is ultimately necessary for quality and productivity improvement within organizations adopting component-based software engineering (CBSE). Researchers have proposed a wide range of complexity metrics for software systems; however, these metrics are not sufficient for components and component-based systems and are restricted to module-oriented and object-oriented systems. This study proposes to measure the complexity of JavaBean software components as a reflection of their quality, so that a component can be adapted accordingly to make it more reusable. The proposed metric involves only the design issues of the component and does not consider packaging and deployment complexity. In this way, the complexity of software components can be kept within certain limits, which in turn helps enhance quality and productivity.
Keywords: JavaBean components, complexity, metrics, validation.
4162 Forensic Speaker Verification in Noisy Environments by Enhancing the Speech Signal Using an ICA Approach
Authors: Ahmed Kamil Hasan Al-Ali, Bouchra Senadji, Ganesh Naik
Abstract:
We propose a system to address real environmental noise and channel mismatch in forensic speaker verification. The method is based on suppressing various types of real environmental noise using the independent component analysis (ICA) algorithm. The enhanced speech signal is then processed with mel-frequency cepstral coefficients (MFCC) or MFCC feature warping to extract the essential characteristics of the speech signal. Channel effects are reduced using an intermediate vector (i-vector) and probabilistic linear discriminant analysis (PLDA) approach for classification. The proposed algorithm is evaluated on an Australian forensic voice comparison database combined with car, street, and home noises from QUT-NOISE at signal-to-noise ratios (SNR) ranging from -10 dB to 10 dB. Experimental results indicate that MFCC feature warping with ICA reduces the equal error rate by about 48.22%, 44.66%, and 50.07% relative to MFCC feature warping alone when the test speech signals are corrupted with random sessions of street, car, and home noise at -10 dB SNR.
Keywords: Noisy forensic speaker verification, ICA algorithm, MFCC, MFCC feature warping.
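The enhancement-plus-feature-extraction front end can be sketched as follows, assuming a two-channel recording so that FastICA has as many observations as sources; the synthetic tone and noise stand in for forensic speech and QUT-NOISE, and the i-vector/PLDA back end is only indicated by the resulting MFCC matrix.

```python
import numpy as np
import librosa
from sklearn.decomposition import FastICA

sr = 16000
t = np.arange(sr) / sr
speech = np.sin(2 * np.pi * 220 * t)                 # stand-in for a clean voice
noise = 0.5 * np.random.randn(sr)                    # stand-in for street/car/home noise

# Two observed mixtures (e.g. a two-microphone recording) -- an assumption;
# ICA needs at least as many observed channels as sources.
mixtures = np.c_[0.8 * speech + 0.6 * noise,
                 0.4 * speech + 0.9 * noise]

sources = FastICA(n_components=2, random_state=0).fit_transform(mixtures)
# Pick one separated component; ICA does not fix order or sign, so in practice
# the speech-dominant component would be selected and rescaled.
enhanced = sources[:, 0] / np.max(np.abs(sources[:, 0]))

mfcc = librosa.feature.mfcc(y=enhanced.astype(np.float32), sr=sr, n_mfcc=13)
print(mfcc.shape)   # (13, n_frames) features for the i-vector/PLDA back end
```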
4161 Airfoils Aerodynamic Efficiency Study in Heavy Rain via Two Phase Flow Approach
Authors: M. Ismail, Cao Yihua, Zhao Ming
Abstract:
Heavy rainfall greatly affects the aerodynamic performance of aircraft, and many accidents have been caused by the resulting degradation in aerodynamic efficiency. In this paper we study the effects of heavy rain on the aerodynamic efficiency of the NACA 64-210 and NACA 0012 airfoils. CFD and a preprocessing grid generator are used as the main analytical tools, and rain is simulated with the two-phase flow approach's Discrete Phase Model (DPM). Raindrops are assumed to be non-interacting, non-deforming, non-evaporating, and non-spinning spheres. Both airfoil sections exhibited a significant reduction in lift and an increase in drag for a given lift condition in simulated rain. The most significant difference between the two airfoils was the sensitivity of the NACA 64-210 to liquid water content (LWC), whereas the performance losses of the NACA 0012 in the rain environment are not a function of LWC. It is expected that the quantitative information gained in this paper will be useful to the operational airline industry, and that greater effort, such as small-scale and full-scale flight tests, should be put in this direction to further improve aviation safety.
Keywords: airfoil, discrete phase modeling, heavy rain, Reynolds number
4160 The Design, Development, and Optimization of a Capacitive Pressure Sensor Utilizing an Existing 9 DOF Platform
Authors: Andrew Randles, Ilker Ocak, Cheam Daw Don, Navab Singh, Alex Gu
Abstract:
Nine Degrees of Freedom (9 DOF) systems are already in development in many areas. In this paper, an integrated pressure sensor is proposed that will make use of an already existing monolithic 9 DOF inertial MEMS platform. Capacitive pressure sensors can suffer from limited sensitivity for a given size of membrane. This novel pressure sensor design increases the sensitivity by over 5 times compared to a traditional array of square diaphragms while still fitting within a 2 mm x 2 mm chip and maintaining a fixed static capacitance. The improved design uses one large diaphragm supported by pillars with fixed electrodes placed above the areas of maximum deflection. The design optimization increases the sensitivity from 0.22 fF/kPa to 1.16 fF/kPa. Temperature sensitivity was also examined through simulation.
Keywords: Capacitive pressure sensor, 9 DOF, 10 DOF, sensor, capacitive, inertial measurement unit, IMU, inertial navigation system, INS.
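A back-of-the-envelope parallel-plate estimate of how capacitance responds to diaphragm deflection, which is the quantity behind the fF/kPa sensitivity figures quoted above. The electrode area, gap, and deflection-per-kPa values are assumptions, not the paper's design parameters.

```python
EPS0 = 8.854e-3             # permittivity of free space in fF/µm
area_um2 = 200.0 * 200.0    # assumed electrode area over a region of maximum deflection (µm^2)
gap_um = 2.0                # assumed electrode gap (µm)
defl_per_kpa_um = 0.005     # assumed deflection per kPa under the electrode (µm)

def capacitance_fF(gap):
    """Parallel-plate capacitance C = eps0 * A / g (fringing fields ignored)."""
    return EPS0 * area_um2 / gap

# Small-signal sensitivity: capacitance change for a 1 kPa pressure step.
sensitivity = capacitance_fF(gap_um - defl_per_kpa_um) - capacitance_fF(gap_um)
print(f"static C = {capacitance_fF(gap_um):.1f} fF, sensitivity ~ {sensitivity:.3f} fF/kPa")
```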
4159 Optical 3D-Surface Reconstruction of Weak Textured Objects Based on an Approach of Disparity Stereo Inspection
Authors: Thomas Kerstein, Martin Laurowski, Philipp Klein, Michael Weyrich, Hubert Roth, Jürgen Wahrburg
Abstract:
Optical 3D measurement of objects is meaningful in numerous industrial applications. In many cases, shape acquisition of weakly textured objects is essential. Examples are repetition parts made of plastic or ceramic, such as housing parts or ceramic bottles, as well as agricultural products like tubers. These parts are often conveyed in a wobbling way during automated optical inspection, so conventional 3D shape acquisition methods such as laser scanning might fail. In this paper, a novel approach for acquiring the 3D shape of weakly textured and moving objects is presented. To facilitate such measurements, an active stereo vision system with structured light is proposed. The system consists of multiple camera pairs and auxiliary laser pattern generators. It performs the shape acquisition within one shot and is beneficial for rapid inspection tasks. An experimental setup including hardware and software has been developed and implemented.
Keywords: Automated optical inspection, depth from structured light, stereo vision, surface reconstruction.
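The depth recovery underlying disparity stereo inspection is the standard triangulation relation Z = f·B/d; the sketch below applies it to a small disparity map, with the focal length and baseline chosen purely for illustration.

```python
import numpy as np

focal_px = 1200.0        # assumed focal length in pixels
baseline_m = 0.10        # assumed distance between the two cameras of a pair (m)

# Toy disparity map in pixels; a real one would come from matching the
# projected laser pattern between the two cameras of a pair.
disparity = np.array([[40.0, 42.0, 41.0],
                      [38.5, 39.0, 40.5]])

depth_m = focal_px * baseline_m / disparity   # Z = f * B / d, element-wise
print(depth_m.round(3))
```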
4158 Unsupervised Outlier Detection in Streaming Data Using Weighted Clustering
Authors: Yogita, Durga Toshniwal
Abstract:
Outlier detection in streaming data is very challenging because streaming data cannot be scanned multiple times and new concepts may keep evolving. Irrelevant attributes can be termed noisy attributes, and such attributes further magnify the challenge of working with data streams. In this paper, we propose an unsupervised outlier detection scheme for streaming data. The scheme is based on clustering, since clustering is an unsupervised data mining task that does not require labeled data; both density-based and partitioning clustering are combined for outlier detection. Partitioning clustering is also used to assign adaptive weights to attributes according to their relevance, and the weighted attributes help reduce or remove the effect of noisy attributes. Keeping in view the challenges of streaming data, the proposed scheme is incremental and adaptive to concept evolution. Experimental results on synthetic and real-world data sets show that the proposed approach outperforms an existing approach (CORM) in terms of outlier detection rate, false alarm rate, and robustness to increasing percentages of outliers.
Keywords: Concept Evolution, Irrelevant Attributes, Streaming Data, Unsupervised Outlier Detection.
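A simplified, offline sketch of the weighting idea: partitioning clustering supplies per-attribute weights (here the inverse of the within-cluster variance, so noisy attributes receive small weights), and the weighted distance to the nearest centroid serves as the outlier score. This illustrates the principle only; it is not the incremental streaming algorithm or the comparison with CORM.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two informative attributes forming clusters, plus one irrelevant (noisy) attribute.
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(6, 1, (100, 2))])
X = np.column_stack([X, rng.uniform(-10, 10, 200)])
X = np.vstack([X, [[3.0, 3.0, 0.0]]])                 # a point far from both clusters

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Attribute weights: inverse of the average within-cluster variance per attribute.
within_var = np.mean([X[km.labels_ == c].var(axis=0) for c in range(2)], axis=0)
weights = 1.0 / (within_var + 1e-9)
weights /= weights.sum()

# Outlier score: weighted Euclidean distance to the nearest centroid.
diffs = X[:, None, :] - km.cluster_centers_[None, :, :]
scores = np.sqrt((weights * diffs**2).sum(axis=2)).min(axis=1)
print("attribute weights:", weights.round(3), "highest-scoring point index:", int(scores.argmax()))
```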
4157 Design Standardization in Aramco: Strategic Analysis
Authors: Mujahid S. Alharbi
Abstract:
The construction of process plants in oil and gas-producing countries, such as Saudi Arabia, necessitates substantial investment in design and building. Each new plant, while unique, includes common building types, suggesting an opportunity for design standardization. This study investigates the adoption of standardized Issue for Construction (IFC) packages for non-process buildings in Saudi Aramco. A SWOT analysis presents the strengths, weaknesses, opportunities, and threats of this approach. The approach's benefits are illustrated using the Hawiyah Unayzah Gas Reservoir Storage Program (HUGRSP) as a case study. Standardization not only offers significant cost savings and operational efficiencies, but also expedites project timelines, reduces the potential for change orders, and fosters local economic growth by allocating building tasks to local contractors. Standardization also improves project management by easing interface constraints between different contractors and promoting adaptability to future industry changes. This research underscores the standardization of non-process buildings as a powerful strategy for cost optimization, efficiency enhancement, and local economic development in process plant construction within the oil and gas sector.
Keywords: Building, construction, management, project, standardization.
4156 Automated Process Quality Monitoring with Prediction of Fault Condition Using Measurement Data
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events is important to improve the safety and reliability of machine operations and to reduce losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. Constructing models that predict faulty conditions is therefore essential for deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach utilizes genetic algorithm (GA) based variable selection, and the predictive performance of several prediction methods is evaluated on real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme, prediction performance can be improved by excluding non-informative variables from the model building step.
Keywords: Prediction, operation monitoring, on-line data, nonlinear statistical methods, empirical model.
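A reduced sketch of the calibration workflow: a variable selection step (a simple correlation filter here, plainly standing in for the paper's genetic algorithm) followed by a principal-component calibration model (ordinary PCA regression standing in for SPPCA). The data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 30))                                   # machine measurement variables
y = 2 * X[:, 0] - 3 * X[:, 1] + X[:, 2] + rng.normal(0, 0.5, 300)  # fault-related response

# Variable selection: keep the 10 variables most correlated with the response
# (a stand-in for the GA-based selection used in the paper).
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
selected = np.argsort(corr)[-10:]

# Calibration model: PCA regression as a plain substitute for SPPCA.
model = make_pipeline(PCA(n_components=5), LinearRegression())
model.fit(X[:200][:, selected], y[:200])
print("held-out R^2:", round(model.score(X[200:][:, selected], y[200:]), 3))
```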
4155 Proposal for Cost Calculation of Warehouse Processes and Its Usage for Setting Standards for Performance Evaluation
Authors: Tomas Cechura, Michal Simon
Abstract:
This paper describes a proposal for the cost calculation of warehouse processes and its usage for setting standards for performance evaluation. One of the most common ways of monitoring process performance is benchmarking, whose typical outcome is whether the monitored object is better or worse than an average or a standard. Traditional approaches, however, cannot identify specific opportunities to improve performance or eliminate inefficiencies in processes. Higher process efficiency can be achieved, for example, by reducing costs while generating the same output; yet costs can be reduced only if we know their structure and are able to calculate them accurately. In the area of warehouse processes this is rather difficult because, in most cases, only aggregated values with low explanatory power are available. The aim of this paper is to create a suitable method for calculating storage costs. A practical example of the process calculation is shown at the end.
Keywords: Calculation, Costs, Performance, Process, Warehouse.
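An illustration of the kind of process-level cost calculation the proposal argues for, in contrast to aggregated values: each warehouse process gets a cost pool and an activity driver volume, and the resulting unit costs can serve as performance standards. All figures are invented.

```python
# Cost pools (per month) and activity driver volumes for warehouse processes.
processes = {
    #               monthly cost,  driver volume,  driver
    "receiving": (12000.0, 3000, "pallets received"),
    "put-away":  (9000.0,  3000, "pallets stored"),
    "picking":   (25000.0, 20000, "order lines picked"),
    "shipping":  (8000.0,  1600, "shipments"),
}

# Unit cost per driver = cost pool / driver volume; usable as a standard
# against which actual period performance can be benchmarked.
standards = {name: cost / volume for name, (cost, volume, _) in processes.items()}
for name, unit_cost in standards.items():
    print(f"{name:10s}: {unit_cost:6.2f} per {processes[name][2]}")
```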
4154 Streamwise Conduction of Nanofluidic Flow in Microchannels
Authors: Yew Mun Hung, Ching Sze Lim, Tiew Wei Ting, Ningqun Guo
Abstract:
The effect of streamwise conduction on the thermal characteristics of forced convection for nanofluidic flow in rectangular microchannel heat sinks under isothermal wall conditions has been investigated. By applying the fin approach, models with and without the streamwise conduction term in the energy equation were developed for hydrodynamically and thermally fully developed flow. The two models were solved to obtain closed-form analytical solutions for the nanofluid and solid wall temperature distributions, and the analysis emphasizes the variations that streamwise conduction induces in the nanofluid heat transport characteristics. The effects of the Peclet number, nanoparticle volume fraction, and thermal conductivity ratio on the thermal characteristics of forced convection in microchannel heat sinks are analyzed. Due to the anomalous increase in the effective thermal conductivity of a nanofluid compared to its base fluid, the effect of streamwise conduction is expected to be more significant. This study identifies the conditions under which streamwise conduction should not be neglected in the forced convective heat transfer analysis of microchannel heat sinks.
Keywords: Fin approach, microchannel heat sink, nanofluid, streamwise conduction.
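For orientation, a small calculation of two quantities the analysis turns on: an effective nanofluid thermal conductivity (using the classical Maxwell model as an assumption, since the abstract does not name a correlation) and the Peclet number of the channel flow. The property values and channel dimensions are illustrative only.

```python
def maxwell_k_eff(k_f, k_p, phi):
    """Maxwell model for the effective conductivity of a dilute nanoparticle suspension."""
    return k_f * (k_p + 2 * k_f + 2 * phi * (k_p - k_f)) / (k_p + 2 * k_f - phi * (k_p - k_f))

k_f, k_p, phi = 0.613, 401.0, 0.02     # water, copper particles, 2% volume fraction (assumed)
rho, cp = 998.0, 4182.0                # base-fluid density (kg/m^3) and heat capacity (J/kg K)
u, d_h = 0.5, 100e-6                   # mean velocity (m/s) and hydraulic diameter (m), assumed

k_eff = maxwell_k_eff(k_f, k_p, phi)
alpha = k_eff / (rho * cp)             # thermal diffusivity with the effective conductivity
peclet = u * d_h / alpha               # Pe = u * D_h / alpha; streamwise conduction matters at low Pe
print(f"k_eff = {k_eff:.3f} W/m K, Pe = {peclet:.0f}")
```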
4153 Person Identification by Using AR Model for EEG Signals
Authors: Gelareh Mohammadi, Parisa Shoushtari, Behnam Molaee Ardekani, Mohammad B. Shamsollahi
Abstract:
A direct connection between the electroencephalogram (EEG) and the genetic information of individuals has been investigated by neurophysiologists and psychiatrists since the 1960s, and it opens a new research area. This paper focuses on person identification based on features extracted from the EEG, which can show a direct connection between the EEG and the genetic information of subjects. In this work, the full eyes-open (EO) EEG signal of healthy individuals is modeled by an autoregressive (AR) process and the AR parameters are extracted as features. Two methods of constructing the feature vector are proposed: in the first, the extracted parameters of each channel are used as a feature vector in a classification step that employs a competitive neural network; in the second, a combination of parameters from different channels is used as the feature vector. Correct classification scores in the range of 80% to 100% reveal the potential of the approach for person classification/identification and agree with previous research showing evidence that the EEG signal carries genetic information. The novelty of this work lies in the combination of AR parameters and the network type (competitive network) used; a comparison between the first and second approaches implies a preference for the second.
Keywords: Person identification, autoregressive model, EEG, neural network.
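A minimal sketch of the AR feature extraction step: the Yule-Walker equations give the AR coefficients of a (here synthetic) single-channel EEG trace; following the second method above, the coefficients of several channels would be concatenated into one feature vector. The model order is an assumption.

```python
import numpy as np
from scipy.linalg import toeplitz

def ar_coefficients(x, order):
    """Estimate AR(order) coefficients via the Yule-Walker equations."""
    x = x - x.mean()
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)]) / len(x)
    return np.linalg.solve(toeplitz(r[:order]), r[1:order + 1])

# Toy single-channel "EEG" trace; a real feature vector would concatenate
# the AR parameters of every channel (or of selected channels).
rng = np.random.default_rng(0)
eeg = np.sin(np.linspace(0, 60, 2560)) + 0.3 * rng.standard_normal(2560)
features = ar_coefficients(eeg, order=6)
print(features.round(3))
```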
4152 Semi-Automatic Analyzer to Detect Authorial Intentions in Scientific Documents
Authors: Kanso Hassan, Elhore Ali, Soule-dupuy Chantal, Tazi Said
Abstract:
Information retrieval studies models and systems that allow a user to find the documents relevant to his or her information need. Information search remains a difficult problem because of the difficulty of representing and processing natural language, with phenomena such as polysemy. Intentional structures promise to be a new paradigm for extending existing document structures and enhancing the different phases of document processing, such as creation, editing, search, and retrieval. Recognizing the intentions of a text's authors can reduce the scale of this problem. In this article, we present an intention recognition system based on a semi-automatic method for extracting intentional information from a text corpus. The system is also able to update the ontology of intentions to enrich the knowledge base containing all possible intentions of a domain. The approach relies on the construction of a semi-formal ontology, considered as the conceptualization of the intentional information contained in a text. Experiments on scientific publications in the field of computer science were carried out to validate this approach.
Keywords: Information research, text analysis, intentional structure, segmentation, ontology, natural language processing.
4151 Eco-Innovation as a New Sustainable Development Strategy: Case Studies
Authors: Orhan Çoban, Nuryağdı Rozıyev, Fehmi Karasioğlu
Abstract:
Sustainable development has recently been one of the most debated issues. To keep the Earth livable, protecting the environment matters even as production activities continue. As a strategy for sustainable development, eco-innovation is the application of innovations to reduce environmental burdens. Endeavors to understand eco-innovation processes have drawn on environmental economics and innovation economics within neoclassical economics, as well as on evolutionary economics outside the neoclassical tradition. After explaining the theoretical framework of eco-innovation, this study aims to present activities in this field through case study analyses. The study consists of five sections, including the introduction and conclusion. The second section defines the concepts related to eco-innovation and classifies eco-innovations. The third section considers the neoclassical and evolutionary approaches drawn from neoclassical and evolutionary economics, respectively. The fourth section presents case studies of successful eco-innovations. The last section concludes and offers suggestions for future eco-innovation research based on the theoretical framework and the case studies.
Keywords: Sustainable development, innovation, eco-innovation, neoclassical approach, evolutionary approach, case studies.
4150 A New Face Detection Technique using 2D DCT and Self Organizing Feature Map
Authors: Abdallah S. Abdallah, A. Lynn Abbott, Mohamad Abou El-Nasr
Abstract:
This paper presents a new technique for the detection of human faces within color images. The approach relies on image segmentation based on skin color, features extracted from the two-dimensional discrete cosine transform (DCT), and self-organizing maps (SOM). After candidate skin regions are extracted, feature vectors are constructed using DCT coefficients computed from those regions. A supervised SOM training session is used to cluster feature vectors into groups and to assign "face" or "non-face" labels to those clusters. Evaluation was performed using a new image database of 286 images containing 1027 faces. After training, the detection technique achieved a detection rate of 77.94% during subsequent tests, with a false positive rate of 5.14%. To our knowledge, the proposed technique is the first to combine DCT-based feature extraction with a SOM for detecting human faces within color images. It is also one of few attempts to combine a feature-invariant approach, such as color-based skin segmentation, with appearance-based face detection. The main advantage of the new technique is its low computational requirements, in terms of both processing speed and memory utilization.
Keywords: Face detection, skin color segmentation, self-organizing map.
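The feature-construction step can be sketched as follows: a candidate skin region is reduced to a fixed-size grayscale block and its low-frequency 2D DCT coefficients form the feature vector that would then be clustered by the SOM (the SOM training itself is not shown). The block size and the number of retained coefficients are assumptions.

```python
import numpy as np
from scipy.fft import dctn

def dct_features(region, keep=8):
    """Low-frequency 2D DCT coefficients of a grayscale region as a feature vector."""
    coeffs = dctn(region, norm="ortho")          # two-dimensional DCT-II
    return coeffs[:keep, :keep].ravel()          # keep the top-left (low-frequency) block

# Toy 32x32 grayscale "candidate skin region"; a real one would come from
# the skin-color segmentation stage.
rng = np.random.default_rng(0)
region = rng.random((32, 32))
features = dct_features(region)
print(features.shape)   # (64,) vector fed to the SOM for face / non-face clustering
```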
4149 Data Envelopment Analysis with Partially Perfect Objects
Authors: Alexander Y. Vaninsky
Abstract:
This paper presents a simplified version of Data Envelopment Analysis (DEA), a conventional approach to evaluating the performance and ranking of competitive objects characterized by two groups of factors acting in opposite directions: inputs and outputs. DEA with a Perfect Object (DEA PO) augments the group of actual objects with a virtual perfect object, one having the greatest outputs and smallest inputs. This allows an explicit analytical solution to be obtained and is a step toward absolute efficiency. The paper develops this approach further and introduces a DEA model with Partially Perfect Objects (DEA PPO), which consecutively eliminates the smallest relative inputs or greatest relative outputs and applies DEA PO to the reduced collections of indicators. The partial efficiency scores are combined to obtain a weighted efficiency score. The computational scheme remains as simple as that of DEA PO, but DEA PPO has the advantage of taking all inputs and outputs of each actual object into account. Firm evaluation is considered as an example.
Keywords: Data Envelopment Analysis, Perfect object, Partially perfect object, Partial efficiency, Explicit solution, Simplified algorithm.
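A toy, equal-weights illustration of the perfect-object idea: the virtual perfect object takes the largest outputs and smallest inputs in the sample, and each firm is scored by how close it comes to that object. This is not the paper's closed-form DEA PO/PPO solution, only the underlying construction; the data are invented.

```python
import numpy as np

# Rows = firms; columns = inputs (e.g. labour, capital) and outputs (e.g. revenue, profit).
inputs  = np.array([[10.0, 5.0], [8.0, 7.0], [12.0, 4.0], [9.0, 6.0]])
outputs = np.array([[100.0, 20.0], [90.0, 25.0], [110.0, 15.0], [95.0, 22.0]])

# Virtual perfect object: greatest outputs and smallest inputs in the sample.
perfect_in, perfect_out = inputs.min(axis=0), outputs.max(axis=0)

# Equal-weight scores relative to the perfect object (an illustrative scoring,
# not the paper's explicit analytical solution).
out_score = (outputs / perfect_out).mean(axis=1)   # closeness to the best outputs
in_score = (perfect_in / inputs).mean(axis=1)      # closeness to the smallest inputs
efficiency = out_score * in_score                  # equals 1.0 only for the perfect object itself
print(efficiency.round(3))
```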