Search results for: Approach Tendency
3765 Analysis of the Diffusion Behavior of an Information and Communication Technology Platform for City Logistics
Authors: Giulio Mangano, Alberto De Marco, Giovanni Zenezini
Abstract:
The concept of City Logistics (CL) has emerged to mitigate the impacts of last-mile freight distribution in urban areas. In this paper, a System Dynamics (SD) model exploring the dynamics of the diffusion of an ICT platform for CL management across different populations is proposed. Two sources have been used to develop the model. On the one hand, the major diffusion variables and feedback loops are derived from a literature review of existing diffusion models. On the other hand, the parameters are represented by the value propositions delivered by the platform in response to users' needs. The Business Model Canvas approach has been used to extract the most important value propositions, since it focuses on understanding how a company creates value for its target customers. These variables and parameters are then translated into an SD diffusion model with three populations: municipalities, logistics service providers, and own-account carriers. Results show that the three populations under analysis fully adopt the platform within the simulation time frame, highlighting a strong demand by different stakeholders for CL projects aimed at more efficient urban logistics operations.
Keywords: City logistics, simulation, system dynamics, business model.
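A minimal sketch of the kind of diffusion dynamics the abstract describes: a Bass-type adoption model integrated over three coupled populations, with cross-population word of mouth. All population sizes and coefficients below are hypothetical illustrations, not the paper's calibrated SD model.

```python
import numpy as np

# Bass-type diffusion across three adopter populations (municipalities,
# logistics service providers, own-account carriers). Parameter values
# are hypothetical, not taken from the paper.
P = {"municipalities": 200, "lsp": 1500, "oac": 5000}     # population sizes
p = {"municipalities": 0.01, "lsp": 0.005, "oac": 0.003}  # innovation coeff.
q = {"municipalities": 0.30, "lsp": 0.25, "oac": 0.20}    # imitation coeff.

dt, T = 0.1, 120.0          # time step and horizon, in months
steps = int(T / dt)
A = {k: 0.0 for k in P}     # current adopters per population

for _ in range(steps):
    # overall adoption share drives cross-population word of mouth
    share = sum(A.values()) / sum(P.values())
    for k in P:
        rate = (p[k] + q[k] * share) * (P[k] - A[k])  # Bass adoption flow
        A[k] += rate * dt                             # Euler integration

for k in P:
    print(f"{k}: {A[k]:.0f} of {P[k]} adopters after {T:.0f} months")
```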
3764 Knowledge Representation and Retrieval in Design Project Memory
Authors: Smain M. Bekhti, Nada T. Matta
Abstract:
Knowledge sharing in general, and contextual access to knowledge in particular, remain key challenges in the knowledge management framework. Researchers in the semantic web and human-machine interface fields study techniques to enhance this access. For instance, in the semantic web, information retrieval is based on domain ontologies; in human-machine interfaces, keeping track of the user's activity provides elements of context that can guide access to information. We suggest an approach based on these two guidelines, whilst avoiding some of their weaknesses. The approach represents both the context and the design rationale of a project for efficient access to knowledge. The method consists of an information retrieval environment that, on the one hand, can infer knowledge modeled as a semantic network and, on the other hand, is driven by the context and the objectives of a specific activity (the design). The environment can also be used to gather similar project elements in order to build classifications of the tasks, problems, arguments, etc. produced in a company. These classifications can show the evolution of design strategies in the company.
Keywords: Project memory, knowledge re-use, design rationale, knowledge representation.
3763 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering
Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause
Abstract:
In recent years, object detection has gained much attention and become a very active research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management, and environmental services. However, to the best of our knowledge, object detection under illumination with shadow consideration has not been well solved yet, and this problem remains one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object under uneven illumination. Because image capture conditions are not controlled, the algorithms need to consider a range of environmental factors, as colour information, lighting, and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies, and colour ranges. To overcome these limitations, we propose an object detection algorithm with pre-processing methods that reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show promising results for the proposed approach in comparison with existing methods.
Keywords: Image processing, illumination equalization, shadow filtering, object detection, colour models, image segmentation.
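A minimal OpenCV sketch of the pipeline the abstract outlines: YCrCb conversion, illumination equalization on the luma channel, segmentation, then erosion/dilation and contour filtering. Otsu thresholding stands in for the paper's unspecified parameter-free segmentation step, and the minimum-area filter value is an assumption.

```python
import cv2
import numpy as np

def detect_objects(bgr):
    """Sketch of the abstract's pipeline: YCrCb conversion, illumination
    equalization, adaptive segmentation, morphology, contours. Otsu
    thresholding stands in for the paper's adaptive step."""
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)
    y = cv2.equalizeHist(y)                       # reduce uneven illumination
    blur = cv2.GaussianBlur(y, (5, 5), 0)
    _, mask = cv2.threshold(blur, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.erode(mask, kernel, iterations=1)   # remove shadow speckle
    mask = cv2.dilate(mask, kernel, iterations=2)  # restore object regions
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > 500]

# usage: boxes = [cv2.boundingRect(c) for c in detect_objects(cv2.imread("stack.jpg"))]
```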
3762 Performance Management of Tangible Assets within the Balanced Scorecard and Interactive Business Decision Tools
Authors: Raymond K. Jonkers
Abstract:
The present study investigated approaches and techniques to enhance strategic management governance and decision making within the framework of a performance-based balanced scorecard. A review of best practices from strategic, program, process, and systems engineering management provided for a holistic approach toward effective outcome-based capability management. One technique, based on factorial experimental design methods, was used to develop an empirical model. This model predicts the degree of capability effectiveness as a function of controlled system input variables and their weightings. These variables represent business performance measures captured within a strategic balanced scorecard, and their weighting enhances the ability to quantify causal relationships within balanced scorecard strategy maps. The focus in this study was on the performance of tangible assets within the scorecard, rather than the traditional approach of assessing the performance of intangible assets such as knowledge and technology. Tangible assets are represented here as physical systems, which may be thought of as being aboard a ship or within a production facility. The measures assigned to these systems include project funding for upgrades against demand, system certifications achieved against those required, preventive-to-corrective maintenance ratios, and material support personnel capacity against that required to support the respective systems. The resultant scorecard is viewed as complementary to the traditional balanced scorecard for program and performance management. Its benefits are realized through the quantified state of operational capabilities or outcomes: capabilities are weighted in terms of priority for each distinct system measure, then aggregated and visualized as an overall state of capability achieved. This study proposes the use of interactive controls within the scorecard as a technique to enhance the development of alternative solutions in decision making. These controls allow capability priorities to be assigned and system performance measures to be adjusted, providing for what-if scenarios and options in strategic decision making. In this holistic approach to capability management, several cross-functional processes were highlighted as relevant among the different management disciplines. In terms of assessing an organization's ability to adopt this approach, consideration was given to the P3M3 management maturity model.
Keywords: Outcome based management, performance management, lifecycle costs, balanced scorecard.
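A minimal sketch of the scorecard mechanics described above: normalized tangible-asset measures combined by priority weights into a capability score, with an interactive what-if adjustment. All measure names, values, and weights are hypothetical illustrations, not the paper's empirical model.

```python
# Weighted aggregation of tangible-asset measures into a capability score.
# Values and weights below are hypothetical.
measures = {                 # each measure normalized to [0, 1]
    "upgrade_funding_vs_demand": 0.70,
    "certifications_achieved":   0.90,
    "pm_to_cm_ratio":            0.60,
    "support_capacity":          0.80,
}
weights = {                  # capability priorities, summing to 1
    "upgrade_funding_vs_demand": 0.35,
    "certifications_achieved":   0.15,
    "pm_to_cm_ratio":            0.30,
    "support_capacity":          0.20,
}

def capability(m, w):
    """Overall capability as a priority-weighted sum of measures."""
    return sum(w[k] * m[k] for k in m)

print(f"baseline capability: {capability(measures, weights):.2f}")

# what-if scenario: increase the upgrade-funding measure and recompute
scenario = dict(measures, upgrade_funding_vs_demand=0.85)
print(f"scenario capability: {capability(scenario, weights):.2f}")
```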
3761 A Novel Approach to Handle Uncertainty in Health System Variables for Hospital Admissions
Authors: Manisha Rathi, Thierry Chaussalet
Abstract:
Hospital staff and managers are under pressure to use and manage scarce resources effectively. Hospital admissions require many decisions that have complex and uncertain consequences for resource utilization and patient flow. Predicting a patient's risk of admission and length of stay (LOS) is challenging because of their vague nature, and there is no established method to capture the vague definition of a patient's admission. Moreover, current methods and tools used to predict patients at risk of admission fail to deal with uncertainty in unplanned admissions, LOS, and patients' characteristics. The main objective of this paper is to deal with uncertainty in health system variables and to handle uncertain relationships among them. Machine learning techniques combined with statistical methods such as regression are a proposed solution approach. A model that adapts fuzzy methods to handle uncertain data and uncertain relationships can efficiently capture the vague definition of a patient's admission.
Keywords: Admission, fuzzy, regression, uncertainty.
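A minimal sketch of how fuzzy methods can capture a vague "risk of admission", as the abstract proposes: overlapping triangular membership functions instead of a crisp cutoff. The breakpoints below are hypothetical.

```python
def admission_risk_membership(score):
    """Triangular fuzzy memberships for a vague 'risk of admission' score
    in [0, 100]. Breakpoints are hypothetical illustrations."""
    def tri(x, a, b, c):
        # triangular membership: rises from a to b, falls from b to c
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return {
        "low":    tri(score, -1, 0, 50),
        "medium": tri(score, 25, 50, 75),
        "high":   tri(score, 50, 100, 101),
    }

# a score of 62 is partly 'medium' and partly 'high', not a hard category
print(admission_risk_membership(62))
```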
3760 Examination of Pre-Tender Budgeting Techniques for Mechanical and Electrical Services in Malaysia
Authors: Ganiyu Amuda Yusuf, Sarajul Fikri Mohamed
Abstract:
The procurement and cost management approach adopted for mechanical and electrical (M&E) services in the Malaysian construction industry has been criticized for its inefficiency. The study examined early cost estimating practices adopted for M&E services in Malaysia in order to understand the level of compliance of current techniques with best practices. The methodology adopted is a review of bidding documents used on both completed and on-going building projects awarded between 2008 and 2010 under the 9th Malaysian Plan. The analysis revealed that M&E services cost cannot be reliably estimated at the pre-contract stage; the bidding techniques adopted for M&E services fail to provide a uniform basis for contractors to submit tenders; and detailed measurement of items was not made, which could complicate post-contract cost control and financial management. The paper concludes that there is a need to follow a structured approach in determining the pre-contract cost estimate for M&E services, which would serve as a vital tool for post-contract cost control.
Keywords: Cost management, mechanical and electrical services, procurement, standard method of measurement.
3759 Scaling up Detection Rates and Reducing False Positives in Intrusion Detection using NBTree
Authors: Dewan Md. Farid, Nguyen Huu Hoa, Jerome Darmont, Nouria Harbi, Mohammad Zahidur Rahman
Abstract:
In this paper, we present a new learning algorithm for anomaly-based network intrusion detection using an improved self-adaptive naïve Bayesian tree (NBTree), which induces a hybrid of a decision tree and naïve Bayesian classifiers. The proposed approach balances detection rates across different attack types and keeps false positives at an acceptable level. In large, complex, and dynamic intrusion detection datasets, the detection accuracy of the naïve Bayesian classifier does not scale up as well as that of a decision tree, while naïve Bayesian trees have been shown in other problem domains to improve classification rates on large datasets. In a naïve Bayesian tree, internal nodes split as in a regular decision tree, but the leaves contain naïve Bayesian classifiers. The experimental results on the KDD99 benchmark network intrusion detection dataset demonstrate that the new approach scales up the detection rates for different attack types and reduces false positives in network intrusion detection.
Keywords: Detection rates, false positives, network intrusion detection, naïve Bayesian tree.
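A minimal sketch of the NBTree idea itself (not the paper's self-adaptive variant): a shallow decision tree whose leaves are refined by naïve Bayesian classifiers, assembled here from scikit-learn parts.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

class SimpleNBTree:
    """Sketch of the NBTree hybrid: a shallow decision tree whose leaves
    hold naive Bayes classifiers. A stand-in for the paper's improved
    self-adaptive NBTree."""
    def __init__(self, max_depth=3):
        self.tree = DecisionTreeClassifier(max_depth=max_depth)
        self.leaf_models = {}

    def fit(self, X, y):
        self.tree.fit(X, y)
        leaves = self.tree.apply(X)             # leaf id for every sample
        for leaf in np.unique(leaves):
            idx = leaves == leaf
            if len(np.unique(y[idx])) > 1:      # impure leaf: refine with NB
                self.leaf_models[leaf] = GaussianNB().fit(X[idx], y[idx])
        return self

    def predict(self, X):
        leaves = self.tree.apply(X)
        out = self.tree.predict(X)              # fallback: pure-leaf label
        for i, leaf in enumerate(leaves):
            if leaf in self.leaf_models:
                out[i] = self.leaf_models[leaf].predict(X[i:i + 1])[0]
        return out
```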
3758 A 3D Approach for Extraction of the Coronary Artery and Quantification of the Stenosis
Authors: Mahdi Mazinani, S. D. Qanadli, Rahil Hosseini, Tim Ellis, Jamshid Dehmeshki
Abstract:
Segmentation and quantification of stenosis are important tasks in assessing coronary artery disease. One of the main challenges is measuring the real diameter of curved vessels. Moreover, uncertainty in segmenting the different tissues of a narrow vessel is an important issue affecting accuracy. This paper proposes an algorithm to extract the coronary arteries and measure the degree of stenosis. A Markovian fuzzy clustering method is applied to model the uncertainty arising from the partial volume effect. The algorithm comprises segmentation, centreline extraction, estimation of the plane orthogonal to the centreline, and measurement of the degree of stenosis. To evaluate accuracy and reproducibility, the approach was applied to a vascular phantom and the results compared with the real diameters. The results for 10 patient datasets were visually judged by a qualified radiologist. The results reveal the superiority of the proposed method over the conventional thresholding method (CTM) on both datasets.
Keywords: 3D coronary artery tree extraction, segmentation, quantification, fuzzy clustering, Markov random field.
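The measurement step rests on the standard percent-diameter-stenosis relation, computed from lumen diameters measured in planes orthogonal to the centreline. A minimal sketch, assuming a common reference-diameter convention that may differ from the paper's:

```python
def percent_stenosis(diameters_mm, reference_mm):
    """Percent diameter stenosis from lumen diameters measured in planes
    orthogonal to the vessel centreline: 100 * (1 - d_min / d_ref). The
    reference-diameter definition is a common convention and may differ
    from the paper's."""
    d_min = min(diameters_mm)
    return 100.0 * (1.0 - d_min / reference_mm)

# diameters sampled along the centreline of a phantom vessel (mm)
print(percent_stenosis([3.1, 2.9, 1.6, 2.8, 3.0], reference_mm=3.0))  # ~46.7
```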
3757 The Evaluation of a Cardiac Index Derived from Anthropometric and Biochemical Parameters in Pediatric Morbid Obesity and Metabolic Syndrome
Authors: Mustafa M. Donma
Abstract:
Metabolic syndrome (MetS) components are noteworthy among children with obesity and morbid obesity, because they point out the cases with MetS, which have a strong tendency toward severe health problems, such as cardiovascular disease, both in childhood and adulthood. In clinical practice, considerable effort is devoted to exposing the striking differences between morbidly obese cases and those with MetS findings, above all concerning cardiometabolic features. The aim of this study was to derive an index that behaves differently, from the cardiac point of view, in children with and without MetS. For this purpose, aspartate transaminase (AST), a cardiac enzyme still used independently to predict cardiac-related problems, was used. 124 children were recruited from the outpatient clinic of the Department of Pediatrics at Tekirdag Namik Kemal University, Faculty of Medicine: 43 children with normal body mass index, and 41 and 40 morbidly obese (MO) children with and without the characteristic features of MetS, respectively. Weight, height, waist circumference (WC), hip circumference (HC), head circumference (HdC), neck circumference (NC), and systolic and diastolic blood pressure values were measured and recorded. Body mass index and anthropometric ratios were calculated. Fasting blood glucose (FBG), insulin (INS), triglycerides (TRG), and high-density lipoprotein cholesterol (HDL-C) analyses were performed, and values for AST, alanine transaminase (ALT), and AST/ALT were obtained. Advanced Donma cardiac index (ADCI) values were calculated. Statistical evaluations, including correlation analysis, were performed with a statistical package program, with p < 0.05 accepted as significant. The index, ADCI, was developed from both anthropometric and biochemical parameters: all anthropometric measurements except weight were included in the equation, along with all biochemical parameters constituting MetS components. The index was tested in each of the three groups, its performance was compared with that of the cardiometabolic index (CMI), and it was checked for compatibility with AST activity. The performance of ADCI was better than that of CMI: a three-fold rather than a two-fold increase was observed in children with MetS compared to MO children. The index was correlated with AST in the MO group and with AST/ALT in the MetS group. In conclusion, the index was superior in uncovering cardiac problems in the MO group and in diagnosing MetS in the MetS group, and it served to discriminate the cardiovascular and MetS aspects of the groups.
Keywords: Aspartate transaminase, cardiac index, metabolic syndrome, obesity.
3756 Adequacy of Object-Oriented Framework System-Based Testing Techniques
Authors: Jehad Al Dallal
Abstract:
An application framework provides a reusable design and implementation for a family of software systems. If the framework contains defects, the defects are passed on to the applications developed from it. Framework defects are hard to discover at the time the framework is instantiated, so it is important to remove all defects before instantiation. In this paper, two measures for the adequacy of an object-oriented system-based testing technique are introduced. The measures assess the usefulness and uniqueness of the testing technique. The two measures are applied to experimentally compare the adequacy of two techniques introduced to test object-oriented frameworks at the system level: the New Framework Test Approach and Testing Frameworks Through Hooks (TFTH). The techniques are also compared analytically in terms of their coverage of object-oriented aspects. The comparison study results show that the TFTH technique is better than the New Framework Test Approach in terms of usefulness degree, uniqueness degree, and coverage power.
Keywords: Object-oriented framework, object-oriented framework testing, test case generation, testing adequacy.
3755 Complex Network Approach to International Trade of Fossil Fuel
Authors: Semanur Soyyiğit Kaya, Ercan Eren
Abstract:
Energy has a prominent role in the development of nations. Countries with energy resources also hold strategic power in the international trade of energy, since energy is essential to all stages of production in an economy. It is therefore important for countries to analyze the weaknesses and strengths of this system. International trade, in turn, is one of the fields that can be analyzed as a complex network via network analysis. A complex network is a tool for analyzing complex systems with heterogeneous, interacting agents: it consists of nodes and the interactions between them, and the aggregate properties that emerge from these interactions are (more or less) distinct from the sum of the parts. Standard approaches to international trade are thus too superficial to analyze such systems. Network analysis instead treats international trade as a network in which countries constitute the nodes and trade relations (exports or imports) constitute the edges. It then becomes possible to analyze the international trade network in terms of indicators specific to complex networks, such as connectivity, clustering, assortativity/disassortativity, and centrality. In this study, international trade in crude oil and coal, two types of fossil fuel, is analyzed from 2005 to 2014 via network analysis. First, it is analyzed in terms of topological parameters such as density, transitivity, and clustering. Afterwards, the fit to a Pareto distribution is tested via the Kolmogorov-Smirnov test. Finally, the weighted HITS algorithm is applied to the data as a centrality measure to determine the real prominence of countries in these trade networks; it is a strong tool for ranking countries with regard to the prominence of their trade partners. We calculate both an export centrality and an import centrality by applying the weighted HITS algorithm to the data, and present the impacts of the trading countries in terms of these high-degree indicators.
Keywords: Complex network approach, fossil fuel, international trade, network theory.
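A minimal sketch of weighted HITS by power iteration on a directed trade matrix, where hub scores play the role of export centrality and authority scores of import centrality. The toy flow matrix below is hypothetical.

```python
import numpy as np

def weighted_hits(W, iters=100, tol=1e-10):
    """Weighted HITS on a trade network. W[i, j] is the export flow from
    country i to country j; hubs ~ exporter prominence, authorities ~
    importer prominence."""
    n = W.shape[0]
    h, a = np.ones(n), np.ones(n)
    for _ in range(iters):
        a_new = W.T @ h                  # authority: weighted in-links
        h_new = W @ a_new                # hub: weighted out-links
        a_new /= np.linalg.norm(a_new)
        h_new /= np.linalg.norm(h_new)
        converged = (np.abs(a_new - a).max() < tol
                     and np.abs(h_new - h).max() < tol)
        a, h = a_new, h_new
        if converged:
            break
    return h, a

# toy 3-country crude oil trade matrix (flows are hypothetical)
W = np.array([[0, 5, 2],
              [1, 0, 4],
              [0, 3, 0]], dtype=float)
hubs, auths = weighted_hits(W)
print("export centrality (hubs):", hubs.round(3))
print("import centrality (authorities):", auths.round(3))
```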
3754 Dependability Tools in Multi-Agent Support for Failures Analysis of Computer Networks
Authors: Myriam Noureddine
Abstract:
During their operation, all systems must remain functional without failures; in this context, the concept of dependability is essential for avoiding disruption of their function. As computer networks are systems with the same dependability requirements, this article deals with an analysis of failures for a computer network. The proposed approach integrates specific tools of the KB3 platform, usually applied in dependability studies of industrial systems. The methodology is supported by a multi-agent system formed by six agents grouped into three meta-agents operating on two levels. The first level concerns a modeling step carried out by a conceptual agent and a generating agent: the conceptual agent builds the knowledge base from the system specifications written in the FIGARO language, while the generating agent automatically produces both the structural model and a dependability model of the system. The second level, simulation, shows the effects of the system's failures through a simulation agent. The approach is validated by applying it to a specific computer network, yielding an analysis of failures through their effects on the considered network.
Keywords: Computer network, dependability, KB3 platform, multi-agent system, failure.
3753 A Practical Approach for Electricity Load Forecasting
Authors: T. Rashid, T. Kechadi
Abstract:
This paper is a continuation of our daily energy peak load forecasting approach using our modified network, a member of the recurrent network family called the feed-forward and feed-back multi-context artificial neural network (FFFB-MCANN). The inputs to the network were exogenous variables, such as the previous and current change in the weather components and the previous and current status of the day, together with endogenous variables such as past changes in the load. An endogenous variable, the current change in the load, was used as the network output. Experiments show that using both endogenous and exogenous variables as inputs to the FFFB-MCANN produces better results than using either type alone. They also show that using the changes in variables such as the weather components and the past load, rather than their absolute values, as inputs to the FFFB-MCANN has a dramatic impact and produces better accuracy.
Keywords: Daily peak load forecasting, feed-forward and feed-back multi-context neural network.
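A minimal sketch of the abstract's key finding, that feeding changes in the weather and load variables rather than their absolute values improves accuracy. A plain MLP regressor on synthetic data stands in for the FFFB-MCANN architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Predict the *change* in daily peak load from *changes* in exogenous
# (weather) and endogenous (past load) variables. The data are synthetic.
rng = np.random.default_rng(0)
days = 400
temp = 20 + 8 * np.sin(np.arange(days) / 58.0) + rng.normal(0, 1, days)
load = 100 + 2.5 * temp + rng.normal(0, 2, days)

d_temp = np.diff(temp)                           # change in weather
d_load = np.diff(load)                           # change in load
X = np.column_stack([d_temp[1:], d_load[:-1]])   # today's dT, yesterday's dL
y = d_load[1:]                                   # target: today's dL

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:-50], y[:-50])                      # hold out the last 50 days
print("R^2 on held-out days:", round(model.score(X[-50:], y[-50:]), 3))
```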
3752 Forensic Speaker Verification in Noisy Environments by Enhancing the Speech Signal Using an ICA Approach
Authors: Ahmed Kamil Hasan Al-Ali, Bouchra Senadji, Ganesh Naik
Abstract:
We propose a system robust to real environmental noise and channel mismatch for forensic speaker verification. The method suppresses various types of real environmental noise using an independent component analysis (ICA) algorithm. The enhanced speech signal is then passed to mel-frequency cepstral coefficient (MFCC) extraction, with or without MFCC feature warping, to capture the essential characteristics of the speech signal. Channel effects are reduced using an intermediate vector (i-vector) and probabilistic linear discriminant analysis (PLDA) approach for classification. The proposed algorithm is evaluated on an Australian forensic voice comparison database combined with car, street, and home noises from QUT-NOISE at signal-to-noise ratios (SNR) ranging from -10 dB to 10 dB. Experimental results indicate that MFCC feature warping with ICA achieves reductions in equal error rate of about 48.22%, 44.66%, and 50.07% over MFCC feature warping alone when the test speech signals are corrupted with random sessions of street, car, and home noise, respectively, at -10 dB SNR.
Keywords: Noisy forensic speaker verification, ICA algorithm, MFCC, MFCC feature warping.
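A minimal sketch of the enhancement front end, assuming a two-channel recording named noisy.wav so that FastICA has multiple observations to separate; the kurtosis-based component selection is an assumption, not the paper's rule, and feature warping plus the i-vector/PLDA back end are omitted.

```python
import numpy as np
import librosa
from sklearn.decomposition import FastICA

# ICA separates a two-channel noisy recording into components, the more
# speech-like component is kept, and MFCCs are extracted from it.
mix, sr = librosa.load("noisy.wav", sr=16000, mono=False)  # shape (2, n)

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(mix.T).T            # (2, n) estimated sources

# heuristic (an assumption): keep the component with the larger kurtosis,
# since speech tends to be more super-Gaussian than diffuse noise
kurt = ((sources - sources.mean(1, keepdims=True)) ** 4).mean(1) \
       / sources.var(1) ** 2
speech = sources[int(np.argmax(kurt))]

mfcc = librosa.feature.mfcc(y=speech.astype(np.float32), sr=sr, n_mfcc=20)
print("MFCC matrix:", mfcc.shape)               # (20, frames)
```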
3751 Airfoil Aerodynamic Efficiency Study in Heavy Rain via a Two-Phase Flow Approach
Authors: M. Ismail, Cao Yihua, Zhao Ming
Abstract:
Heavy rainfall greatly affects the aerodynamic performance of aircraft, and many aircraft accidents have been caused by the resulting degradation of aerodynamic efficiency. In this paper, we study the effects of heavy rain on the aerodynamic efficiency of NACA 64-210 and NACA 0012 airfoils. For our analysis, a CFD method and a preprocessing grid generator are used as the main analytical tools, and rain is simulated via the two-phase flow approach's Discrete Phase Model (DPM). Raindrops are assumed to be non-interacting, non-deforming, non-evaporating, and non-spinning spheres. Both airfoil sections exhibited a significant reduction in lift and an increase in drag for a given lift condition in simulated rain. The most significant difference between the two airfoils was the sensitivity of the NACA 64-210 to liquid water content (LWC), whereas the performance losses of the NACA 0012 in the rain environment are not a function of LWC. It is expected that the quantitative information gained in this paper will be useful to the operational airline industry, and greater effort, such as small-scale and full-scale flight tests, should be put in this direction to further improve aviation safety.
Keywords: Airfoil, discrete phase modeling, heavy rain, Reynolds number.
3750 Optical 3D-Surface Reconstruction of Weakly Textured Objects Based on a Disparity Stereo Inspection Approach
Authors: Thomas Kerstein, Martin Laurowski, Philipp Klein, Michael Weyrich, Hubert Roth, Jürgen Wahrburg
Abstract:
Optical 3D measurement of objects is valuable in numerous industrial applications, and in many cases the shape acquisition of weakly textured objects is essential. Examples are repetition parts made of plastic or ceramic, such as housing parts or ceramic bottles, as well as agricultural products like tubers. These parts are often conveyed in a wobbling way during automated optical inspection, so conventional 3D shape acquisition methods like laser scanning might fail. In this paper, a novel approach for acquiring the 3D shape of weakly textured and moving objects is presented. To facilitate such measurements, an active stereo vision system with structured light is proposed. The system consists of multiple camera pairs and auxiliary laser pattern generators. It performs the shape acquisition within one shot and is beneficial for rapid inspection tasks. An experimental setup including hardware and software has been developed and implemented.
Keywords: Automated optical inspection, depth from structured light, stereo vision, surface reconstruction.
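Disparity-based stereo rests on the triangulation relation Z = f·B/d between disparity and depth. A minimal sketch with hypothetical camera parameters:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Core relation behind disparity stereo: Z = f * B / d, with focal
    length f in pixels, baseline B in meters, disparity d in pixels.
    Camera parameters used below are hypothetical."""
    d = np.asarray(disparity_px, dtype=float)
    z = np.full_like(d, np.inf)
    valid = d > 0                        # zero disparity: no match found
    z[valid] = focal_px * baseline_m / d[valid]
    return z

disp = np.array([[40.0, 38.5], [0.0, 42.0]])   # disparities from matching
print(depth_from_disparity(disp, focal_px=1200.0, baseline_m=0.10))  # meters
```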
3749 Unsupervised Outlier Detection in Streaming Data Using Weighted Clustering
Authors: Yogita, Durga Toshniwal
Abstract:
Outlier detection in streaming data is very challenging because streaming data cannot be scanned multiple times and new concepts may keep evolving. Irrelevant attributes can be termed noisy attributes, and such attributes further magnify the challenge of working with data streams. In this paper, we propose an unsupervised outlier detection scheme for streaming data. The scheme is based on clustering, as clustering is an unsupervised data mining task that does not require labeled data; density-based and partitioning clustering are combined for outlier detection. Partitioning clustering is also used to assign adaptive weights to attributes depending on their respective relevance; weighted attributes help reduce or remove the effect of noisy attributes. Keeping in view the challenges of streaming data, the proposed scheme is incremental and adaptive to concept evolution. Experimental results on synthetic and real-world data sets show that our proposed approach outperforms an existing approach (CORM) in terms of outlier detection rate and false alarm rate at increasing percentages of outliers.
Keywords: Concept Evolution, Irrelevant Attributes, Streaming Data, Unsupervised Outlier Detection.
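A minimal batch sketch of the scheme's two ingredients: attribute weights derived from partitioning clustering (here an inverse within-cluster-variance rule, an assumption, not necessarily the paper's exact weighting) and density-based detection on the weighted attributes (LOF as a stand-in). The paper's scheme is incremental; this sketch is not.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(1)
X = rng.normal(0, 1, (500, 3))
X[:, 2] = rng.uniform(-3, 3, 500)        # a noisy, irrelevant attribute
X[:5] += 6                               # inject a few outliers

# attribute weights from partitioning clustering: attributes with small
# pooled within-cluster variance are treated as more relevant
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
within_var = np.zeros(X.shape[1])
for c in range(3):
    members = X[km.labels_ == c]
    within_var += members.var(axis=0) * len(members)
weights = 1.0 / (within_var / len(X))
weights /= weights.sum()

# density-based detection on weighted attributes (-1 marks outliers)
lof = LocalOutlierFactor(n_neighbors=20)
labels = lof.fit_predict(X * np.sqrt(weights))
print("attribute weights:", weights.round(3))
print("flagged outliers:", int((labels == -1).sum()))
```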
3748 Design Standardization in Aramco: Strategic Analysis
Authors: Mujahid S. Alharbi
Abstract:
The construction of process plants in oil and gas-producing countries, such as Saudi Arabia, necessitates substantial investment in design and building. Each new plant, while unique, includes common building types, suggesting an opportunity for design standardization. This study investigates the adoption of standardized Issue for Construction (IFC) packages for non-process buildings in Saudi Aramco. A SWOT analysis presents the strengths, weaknesses, opportunities, and threats of this approach. The approach's benefits are illustrated using the Hawiyah Unayzah Gas Reservoir Storage Program (HUGRSP) as a case study. Standardization not only offers significant cost savings and operational efficiencies, but also expedites project timelines, reduces the potential for change orders, and fosters local economic growth by allocating building tasks to local contractors. Standardization also improves project management by easing interface constraints between different contractors and promoting adaptability to future industry changes. This research underscores the standardization of non-process buildings as a powerful strategy for cost optimization, efficiency enhancement, and local economic development in process plant construction within the oil and gas sector.
Keywords: Building, construction, management, project, standardization.
3747 Automated Process Quality Monitoring with Prediction of Fault Condition Using Measurement Data
Authors: Hyun-Woo Cho
Abstract:
Detecting incipient abnormal events is important for improving the safety and reliability of machine operations and for reducing losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines, so constructing prediction models for faulty conditions is essential for deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach utilizes genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods using real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme, the prediction performance of a calibration model can be improved by excluding non-informative variables from the model building step.
Keywords: Prediction, operation monitoring, on-line data, nonlinear statistical methods, empirical model.
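A minimal sketch of GA-based variable selection wrapped around a calibration model. Ridge regression stands in for the paper's SPPCA model, and the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.1, 200)  # 2 informative vars

def fitness(mask):
    """Cross-validated R^2 of the calibration model on selected variables."""
    if mask.sum() == 0:
        return -np.inf
    return cross_val_score(Ridge(), X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, (30, 20))                  # random bit-string pop.
for gen in range(25):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]         # truncation selection
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, 20)
        child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
        flip = rng.random(20) < 0.05                # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.array(children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected variables:", np.flatnonzero(best))  # ideally includes 0 and 3
```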
3746 Streamwise Conduction of Nanofluidic Flow in Microchannels
Authors: Yew Mun Hung, Ching Sze Lim, Tiew Wei Ting, Ningqun Guo
Abstract:
The effect of streamwise conduction on the thermal characteristics of forced convection for nanofluidic flow in rectangular microchannel heat sinks under an isothermal wall has been investigated. By applying the fin approach, models with and without the streamwise conduction term in the energy equation were developed for hydrodynamically and thermally fully-developed flow. The two models were solved to obtain closed-form analytical solutions for the nanofluid and solid wall temperature distributions, and the analysis emphasized the variations induced by streamwise conduction in the nanofluid heat transport characteristics. The effects of the Peclet number, nanoparticle volume fraction, and thermal conductivity ratio on the thermal characteristics of forced convection in microchannel heat sinks are analyzed. Because of the anomalous increase in the effective thermal conductivity of a nanofluid compared to its base fluid, the effect of streamwise conduction is expected to be more significant. This study identifies the conditions under which streamwise conduction should not be neglected in the forced convective heat transfer analysis of microchannel heat sinks.
Keywords: Fin approach, microchannel heat sink, nanofluid, streamwise conduction.
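The comparison can be summarized by the standard one-dimensional fin-approach energy balance with and without the axial conduction term; a sketch of the usual formulation, which may differ in detail from the paper's model, with N = hPL/(ρ c_p u A) the convection parameter:

```latex
% Fin-approach energy balance for the coolant.
% Without streamwise conduction (convection balances wall heat input):
\rho c_p u A \frac{dT}{dx} = h P \,(T_w - T)
% With the streamwise (axial) conduction term retained:
\rho c_p u A \frac{dT}{dx} = k_{\mathrm{eff}} A \frac{d^2 T}{dx^2}
    + h P \,(T_w - T)
% Nondimensionalizing with \theta = (T - T_w)/(T_{\mathrm{in}} - T_w),
% X = x/L and Pe = u L / \alpha_{\mathrm{eff}} gives
\frac{1}{Pe}\,\frac{d^2\theta}{dX^2} - \frac{d\theta}{dX} - N\,\theta = 0
% so the conduction term scales as 1/Pe: it matters at low Peclet number,
% which the enhanced effective conductivity of a nanofluid promotes.
```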
3745 Person Identification Using an AR Model for EEG Signals
Authors: Gelareh Mohammadi, Parisa Shoushtari, Behnam Molaee Ardekani, Mohammad B. Shamsollahi
Abstract:
A direct connection between the electroencephalogram (EEG) and the genetic information of individuals has been investigated by neurophysiologists and psychiatrists since the 1960s, opening a new research area in science. This paper focuses on person identification based on features extracted from the EEG, which can reflect this connection between the EEG and the genetic information of subjects. In this work, the full eyes-open (EO) EEG signals of healthy individuals are estimated by an autoregressive (AR) model, and the AR parameters are extracted as features. Two methods are proposed for constituting the feature vector: in the first, the extracted parameters of each channel are used as a feature vector in a classification step employing a competitive neural network; in the second, a combination of parameters from different channels is used as the feature vector. Correct classification scores in the range of 80% to 100% reveal the potential of our approach for person classification and identification, and agree with previous research showing evidence that the EEG signal carries genetic information. The novelty of this work lies in the combination of AR parameters with the network type (competitive network) that we have used. A comparison between the first and the second approach implies a preference for the second one.
Keywords: Person identification, autoregressive model, EEG, neural network.
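A minimal sketch of the feature pipeline: Yule-Walker AR coefficients per channel concatenated into one vector (the combined-channel method), with nearest-centroid matching standing in for the competitive neural network. The EEG signals are synthetic AR processes with subject-specific dynamics.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker

def ar_features(eeg, order=6):
    """AR(order) coefficients per EEG channel via Yule-Walker, concatenated
    into one feature vector. eeg has shape (channels, samples)."""
    feats = []
    for ch in eeg:
        rho, _sigma = yule_walker(ch, order=order)
        feats.extend(rho)
    return np.array(feats)

def fake_eeg(seed, n=2048, channels=4):
    """Synthetic stand-in for a subject's EEG: an AR(2) process whose
    pole location depends on the subject."""
    r = np.random.default_rng(seed)
    a1 = 0.4 + 0.1 * seed                # subject-specific AR coefficient
    x = np.zeros((channels, n))
    e = r.normal(size=(channels, n))
    for t in range(2, n):
        x[:, t] = a1 * x[:, t - 1] - 0.3 * x[:, t - 2] + e[:, t]
    return x

# nearest-centroid matching stands in for the competitive neural network
centroids = {s: ar_features(fake_eeg(s)) for s in (1, 2, 3)}
probe = ar_features(fake_eeg(2))
best = min(centroids, key=lambda s: np.linalg.norm(centroids[s] - probe))
print("identified subject:", best)
```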
3744 Semi-Automatic Analyzer to Detect Authorial Intentions in Scientific Documents
Authors: Kanso Hassan, Elhore Ali, Soule-dupuy Chantal, Tazi Said
Abstract:
Information retrieval aims to study models and build systems that allow a user to find the documents relevant to his or her information need. Information search remains a difficult problem, in part because of the difficulty of representing and processing natural language, with phenomena such as polysemy. Intentional structures promise to be a new paradigm for extending existing document structures and enhancing the different phases of document processing, such as creation, editing, search, and retrieval. Recognizing the intentions of the authors of texts can reduce the extent of this problem. In this article, we present an intention recognition system based on a semi-automatic method for extracting intentional information from a corpus of texts. The system is also able to update the ontology of intentions, enriching the knowledge base containing all possible intentions of a domain. The approach relies on the construction of a semi-formal ontology, considered as the conceptualization of the intentional information contained in a text. Experiments on scientific publications in the field of computer science were conducted to validate the approach.
Keywords: Information research, text analysis, intentional structure, segmentation, ontology, natural language processing.
3743 Eco-Innovation as a New Sustainable Development Strategy: Case Studies
Authors: Orhan Çoban, Nuryağdı Rozıyev, Fehmi Karasioğlu
Abstract:
Sustainable development is one of the most debated issues of recent years. Keeping the Earth livable requires protecting the environment even as production activities continue. As a strategy for sustainable development, eco-innovation is the application of innovations to reduce environmental burdens. Endeavors to understand eco-innovation processes have drawn on environmental economics and innovation economics within the neoclassical tradition, and on evolutionary economics beyond it. This study aims to present activities in this field through case studies, after explaining the theoretical framework of eco-innovation. The study consists of five sections, including the introduction and conclusion. In the second section, the concepts related to eco-innovation are defined and eco-innovations are classified. The third section considers the neoclassical and evolutionary approaches, from neoclassical and evolutionary economics respectively. The fourth section presents case studies of successful eco-innovations. The last section concludes and offers suggestions for future eco-innovation research in the light of the theoretical framework and the case studies.
Keywords: Sustainable development, innovation, eco-innovation, neoclassical approach, evolutionary approach, case studies.
3742 A New Face Detection Technique Using 2D DCT and Self-Organizing Feature Map
Authors: Abdallah S. Abdallah, A. Lynn Abbott, Mohamad Abou El-Nasr
Abstract:
This paper presents a new technique for the detection of human faces in color images. The approach relies on image segmentation based on skin color, features extracted from the two-dimensional discrete cosine transform (DCT), and self-organizing maps (SOM). After candidate skin regions are extracted, feature vectors are constructed using DCT coefficients computed from those regions. A supervised SOM training session is used to cluster feature vectors into groups and to assign "face" or "non-face" labels to those clusters. Evaluation was performed using a new image database of 286 images containing 1027 faces. After training, our detection technique achieved a detection rate of 77.94% during subsequent tests, with a false positive rate of 5.14%. To our knowledge, the proposed technique is the first to combine DCT-based feature extraction with a SOM for detecting human faces in color images. It is also one of a few attempts to combine a feature-invariant approach, such as color-based skin segmentation, with appearance-based face detection. The main advantage of the new technique is its low computational requirements, in terms of both processing speed and memory utilization.
Keywords: Face detection, skin color segmentation, self-organizing map.
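A minimal sketch of the feature and clustering stages: low-frequency 2-D DCT coefficients from candidate regions, clustered by a small hand-rolled SOM. Skin segmentation and the supervised labeling pass are omitted, and the patches are synthetic stand-ins for candidate skin regions.

```python
import numpy as np
from scipy.fft import dctn

rng = np.random.default_rng(0)
patches = rng.random((200, 32, 32))     # stand-ins for candidate regions

def dct_feature(patch, k=8):
    """Keep the k x k low-frequency block of the 2-D DCT as the feature."""
    return dctn(patch, norm="ortho")[:k, :k].ravel()

X = np.array([dct_feature(p) for p in patches])

# minimal SOM: a 4x4 grid of prototypes trained with a shrinking
# Gaussian neighbourhood and decaying learning rate
grid, dim = 4, X.shape[1]
W = rng.normal(0, 0.1, (grid, grid, dim))
coords = np.dstack(np.meshgrid(np.arange(grid), np.arange(grid),
                               indexing="ij"))
for t in range(3000):
    x = X[rng.integers(len(X))]
    d = np.linalg.norm(W - x, axis=2)
    bi, bj = np.unravel_index(np.argmin(d), d.shape)   # best matching unit
    sigma = 2.0 * np.exp(-t / 1000)                    # neighbourhood radius
    lr = 0.5 * np.exp(-t / 1500)                       # learning rate
    g = np.exp(-((coords - (bi, bj)) ** 2).sum(2) / (2 * sigma ** 2))
    W += lr * g[..., None] * (x - W)

# after training, each unit would receive a "face"/"non-face" label from
# a supervised pass; here we just report the BMU of one patch
d = np.linalg.norm(W - X[0], axis=2)
print("BMU of first patch:", np.unravel_index(np.argmin(d), d.shape))
```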
3741 Data Envelopment Analysis with Partially Perfect Objects
Authors: Alexander Y. Vaninsky
Abstract:
This paper presents a simplified version of Data Envelopment Analysis (DEA), a conventional approach to evaluating the performance and ranking of competitive objects characterized by two groups of factors acting in opposite directions: inputs and outputs. DEA with a Perfect Object (DEA PO) augments the group of actual objects with a virtual Perfect Object, one having the greatest outputs and smallest inputs. This allows an explicit analytical solution to be obtained and is a step toward absolute efficiency. This paper develops the approach further and introduces a DEA model with Partially Perfect Objects (DEA PPO), which consecutively eliminates the smallest relative inputs or greatest relative outputs and applies DEA PO to the reduced collections of indicators. The partial efficiency scores are then combined into a weighted efficiency score. The computational scheme remains as simple as that of DEA PO, but DEA PPO has the advantage of taking into account all of the inputs and outputs of each actual object. Firm evaluation is considered as an example.
Keywords: Data Envelopment Analysis, Perfect object, Partially perfect object, Partial efficiency, Explicit solution, Simplified algorithm.
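An illustrative sketch of scoring against a virtual Perfect Object built from the greatest outputs and smallest inputs in the sample. The ratio-average scoring rule below is a simple illustration, not the paper's exact closed-form DEA PO/PPO solution.

```python
import numpy as np

# Toy firm data: two inputs (labor, capital) and two outputs
# (revenue, quality). Values are hypothetical.
inputs = np.array([[5.0, 30.0],    # firm A
                   [8.0, 25.0],    # firm B
                   [6.0, 40.0]])   # firm C
outputs = np.array([[100.0, 9.0],  # firm A
                    [120.0, 7.0],  # firm B
                    [110.0, 8.0]]) # firm C

perfect_in = inputs.min(axis=0)    # smallest of every input
perfect_out = outputs.max(axis=0)  # greatest of every output

out_score = (outputs / perfect_out).mean(axis=1)  # closeness to best outputs
in_score = (perfect_in / inputs).mean(axis=1)     # closeness to least inputs
efficiency = out_score * in_score                 # illustrative combination

for name, e in zip("ABC", efficiency):
    print(f"firm {name}: efficiency {e:.3f}")
```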
3740 Teaching Material, Books, Publications versus the Practice: Myths and Truths about Installation and Use of Downhole Safety Valve
Authors: Robson da Cunha Santos, Caio Cezar R. Bonifacio, Diego Mureb Quesada, Gerson Gomes Cunha
Abstract:
The paper concerns the safety of oil wells and environmental preservation, which demand great attention and commitment from oil companies and from the people who work with this equipment: from the drilling of a well until its abandonment, the environment must be safeguarded and possible damage prevented. The main objective of the project was a comparison of books, articles, and publications with information gathered during technical visits to Petrobras operational bases. After these visits, information on current methods of use and management, previously unavailable, was made available to a general audience. As a result, a large flow of incorrect and out-of-date information was observed, spanning not only bibliographic archives but also academic resources and materials. While gathering more in-depth information on the manufacturing, assembly, and use of downhole safety valves (DHSVs), several issues previously regarded as correct and customary were found to be uncertain or outdated. For example, the valve was formerly installed 30 meters below the seabed (mud line), whereas the installation depth should instead vary according to the ideal depth for avoiding the zones most prone to hydrate formation, given the local temperature and pressure. Regarding valves with a nitrogen chamber, the books link their use to water depths of 700 meters or more, but in Brazilian exploratory fields they are used from water depths of 600 meters. The valves used in Brazilian fields can be inserted into the production column and are self-equalizing, yet screwed-in, equalizing valves in the production column predominate: although the latter are more expensive to acquire, they are more reliable and efficient, have a longer service life, and do not restrict fluid flow. It follows that, based on research and on theoretical information confronted with the practices actually used in the field, the present project is important and relevant. It will serve as a source of updated, consistent information connecting the academic environment with real exploratory situations, and as accurate, accessible material for future research and academic improvement.
Keywords: Downhole, teaching material, books, practice.
3739 Logistic Model Tree and Expectation-Maximization for Pollen Recognition and Grouping
Authors: Endrick Barnacin, Jean-Luc Henry, Jack Molinié, Jimmy Nagau, Hélène Delatte, Gérard Lebreton
Abstract:
Palynology is a field of interest for many disciplines, with multiple applications such as chronological dating, climatology, allergy treatment, and even honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions, so automating this task is a necessity. Pollen slide analysis is mainly a visual process carried out with the naked eye, which is why a primary route to automating palynology is digital image processing; this method has the lowest cost and relatively good accuracy in pollen retrieval. In this work, we propose a system combining pollen recognition and grouping. It uses a Logistic Model Tree to classify pollen species already known to the system while detecting any unknown species; the unknown pollen species are then divided using a cluster-based approach. Success rates were achieved for the recognition of known species, and automated clustering appears to be a promising approach.
Keywords: Pollen recognition, logistic model tree, expectation-maximization, local binary pattern.
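A minimal sketch of the recognize-then-group pipeline. Logistic regression stands in for the Logistic Model Tree, a per-species Gaussian likelihood test flags unknown pollen (an illustrative novelty rule, not the paper's), and a Gaussian mixture fitted by EM groups the unknowns; the data and the threshold are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
known_X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in ([0, 0], [2, 2])])
known_y = np.repeat([0, 1], 50)                  # two known species
unknown = np.vstack([rng.normal(m, 0.3, (30, 2)) for m in ([4, 0], [0, 4])])

clf = LogisticRegression().fit(known_X, known_y)
density = {c: GaussianMixture(1, random_state=0).fit(known_X[known_y == c])
           for c in (0, 1)}

probe = np.vstack([known_X[:5], unknown])
loglik = np.column_stack([density[c].score_samples(probe) for c in (0, 1)])
is_unknown = loglik.max(axis=1) < -10.0          # illustrative threshold

print("recognized as known species:", clf.predict(probe[~is_unknown]))
grouper = GaussianMixture(n_components=2, random_state=0)
groups = grouper.fit(probe[is_unknown]).predict(probe[is_unknown])
print("unknown group sizes (EM clustering):", np.bincount(groups))
```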
3738 Near-Field Robust Adaptive Beamforming Based on Worst-Case Performance Optimization
Authors: Jing-ran Lin, Qi-cong Peng, Huai-zong Shao
Abstract:
The performance of adaptive beamforming degrades substantially in the presence of steering vector mismatches. This degradation is especially severe in the near field, where the 3-dimensional source location is more difficult to estimate than the 2-dimensional direction of arrival of far-field cases. As a solution, a novel approach to near-field robust adaptive beamforming (RABF) is proposed in this paper. It is a natural extension of traditional far-field RABF and belongs to the class of diagonal loading approaches, with the loading level determined based on worst-case performance optimization. However, unlike methods that solve for the optimal loading iteratively, a simple closed-form solution is suggested here after some approximations, so the optimal weight vector can also be expressed in closed form. Besides its simplicity and low computational cost, the proposed approach reveals how different factors affect the optimal loading as well as the weight vector. Its excellent performance in the near field is confirmed via a number of numerical examples.
Keywords: Robust adaptive beamforming (RABF), near-field, steering vector mismatches, diagonal loading, worst-case performance optimization.
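A minimal sketch of the diagonal-loading beamformer family the paper extends: w proportional to (R + eps*I)^{-1} a, normalized for a distortionless response toward the presumed steering vector. The paper's closed-form worst-case loading level is replaced here by a fixed argument, and the steering vector is a placeholder.

```python
import numpy as np

def robust_mvdr_weights(R, a, loading):
    """Diagonal-loading robust beamformer: solve (R + eps*I) w = a, then
    scale for a unit (distortionless) response toward the presumed
    steering vector a. 'loading' replaces the paper's closed-form level."""
    Rl = R + loading * np.eye(R.shape[0])
    w = np.linalg.solve(Rl, a)
    return w / (a.conj() @ w)

# toy example: 8-element array with a placeholder steering vector
rng = np.random.default_rng(0)
n = 8
a = np.exp(1j * rng.uniform(0, 2 * np.pi, n))
snap = (rng.normal(size=(n, 200)) + 1j * rng.normal(size=(n, 200))) / np.sqrt(2)
R = snap @ snap.conj().T / 200                    # sample covariance

w = robust_mvdr_weights(R, a, loading=0.1 * np.trace(R).real / n)
print("distortionless response:", np.round(a.conj() @ w, 6))  # equals 1
```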
3737 Genetic-Based Multi-Resolution Noisy Color Image Segmentation
Authors: Raghad Jawad Ahmed
Abstract:
Segmentation of a color image composed of different kinds of regions can be a hard problem, namely the computation of exact texture fields and the decision of the optimum number of segmentation areas when the image contains similar and/or non-stationary texture fields. A novel neighborhood-based segmentation approach is proposed. A genetic algorithm is used in the proposed segment-pass optimization process, in which an energy function defined on Markov random fields is minimized. We use an adaptive threshold estimation method for image thresholding in the wavelet domain, based on generalized Gaussian distribution (GGD) modeling of the subband coefficients. This method, called NormalShrink, is computationally efficient and adaptive because the parameters required for estimating the threshold depend on the subband data energy used in the pre-stage of segmentation. A quadtree is employed to implement the multi-resolution framework, which enables the use of different strategies at different resolution levels and hence accelerates the computation. The experimental results obtained with the proposed segmentation approach are very encouraging.
Keywords: Color image segmentation, genetic algorithm, Markov random field, scale space filter.
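A minimal sketch of NormalShrink thresholding in the wavelet domain, as used in the pre-stage of the segmentation: T = beta * sigma_v^2 / sigma_y with beta = sqrt(log(L_k / J)), following the NormalShrink literature; the paper's variant may differ in detail.

```python
import numpy as np
import pywt

def normal_shrink_denoise(img, wavelet="db4", levels=3):
    """NormalShrink: per-subband soft thresholding with an adaptive
    threshold T = beta * sigma_v^2 / sigma_y, beta = sqrt(log(L_k / J)).
    Formula per the NormalShrink literature; details may differ from the
    paper's variant."""
    coeffs = pywt.wavedec2(img, wavelet, level=levels)
    # noise std estimated from the finest diagonal subband
    sigma_v = np.median(np.abs(coeffs[-1][2])) / 0.6745
    out = [coeffs[0]]
    for cH, cV, cD in coeffs[1:]:
        Lk = cH.size                               # subband length at scale k
        beta = np.sqrt(np.log(max(Lk / levels, np.e)))
        bands = []
        for band in (cH, cV, cD):
            sigma_y = band.std() + 1e-12
            T = beta * sigma_v ** 2 / sigma_y      # adaptive threshold
            bands.append(pywt.threshold(band, T, mode="soft"))
        out.append(tuple(bands))
    return pywt.waverec2(out, wavelet)

img = np.random.default_rng(0).normal(0, 1, (128, 128)) + 5.0
print(normal_shrink_denoise(img).shape)
```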
3736 New Approach for Minimizing Wavelength Fragmentation in Wavelength-Routed WDM Networks
Authors: Sami Baraketi, Jean-Marie Garcia, Olivier Brun
Abstract:
Wavelength Division Multiplexing (WDM) is the dominant transport technology used in numerous high-capacity backbone networks based on optical infrastructures. Given the importance of the costs (CapEx and OpEx) associated with these networks, resource management is becoming increasingly important, especially the way the optical circuits, called "lightpaths", are routed throughout the network. This requires efficient algorithms that provide routing strategies at the lowest cost. We focus on the lightpath routing and wavelength assignment (RWA) problem while minimizing wavelength fragmentation over the network. Wavelength fragmentation poses a serious challenge for network operators, since it leads to misuse of the wavelength spectrum and hence to the rejection of new lightpath requests. In this paper, we first establish a new Integer Linear Program (ILP) for the problem based on a node-link formulation, using a multilayer approach in which the original network is decomposed into several network layers, each corresponding to a wavelength. Furthermore, we propose an efficient heuristic based on a greedy algorithm followed by a post-treatment procedure. The results obtained show that the optimal solution is often reached. We also compare our results with those of other RWA heuristic methods.
Keywords: WDM, lightpath, RWA, wavelength fragmentation, optimization, linear programming, heuristic.
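A greatly simplified sketch of the problem the ILP addresses: one precomputed candidate path per demand, binary wavelength-assignment variables, link-clash constraints, and an objective that packs lightpaths into low wavelength indices as a crude fragmentation proxy. The paper's node-link multilayer formulation is richer; the topology and demands below are toy data.

```python
import pulp

links_of = {                       # demand -> links of its candidate path
    "d1": [("A", "B"), ("B", "C")],
    "d2": [("B", "C"), ("C", "D")],
    "d3": [("A", "B")],
}
W = range(3)                       # available wavelengths per fiber

prob = pulp.LpProblem("rwa", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (links_of, W), cat="Binary")
used = pulp.LpVariable.dicts("used", W, cat="Binary")

for d in links_of:                 # every demand gets exactly one wavelength
    prob += pulp.lpSum(x[d][w] for w in W) == 1

all_links = {l for path in links_of.values() for l in path}
for l in all_links:                # no wavelength clash on a shared link
    for w in W:
        prob += pulp.lpSum(x[d][w] for d in links_of
                           if l in links_of[d]) <= 1

for d in links_of:                 # mark a wavelength as used if assigned
    for w in W:
        prob += x[d][w] <= used[w]

prob += pulp.lpSum((w + 1) * used[w] for w in W)   # prefer low indices
prob.solve(pulp.PULP_CBC_CMD(msg=False))

for d in links_of:
    for w in W:
        if x[d][w].value() == 1:
            print(f"{d} -> wavelength {w}")
```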