Search results for: Kernel fuzzy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 935

155 An Improved C-Means Model for MRI Segmentation

Authors: Ying Shen, Weihua Zhu

Abstract:

Medical images are important for identifying different diseases; for example, magnetic resonance imaging (MRI) can be used to investigate the brain, spinal cord, bones, joints, breasts, blood vessels, and heart. In medical image analysis, segmentation is usually the first step: it identifies regions with similar color, intensity, or texture so that diagnosis can proceed from these features. This paper introduces an improved C-means model to segment MRI images. The model uses information entropy to evaluate segmentation results while achieving global optimization. Two contributions are significant. First, a genetic algorithm (GA) achieves the global optimization that the fuzzy C-means clustering algorithm (FCMA) alone cannot. Second, the information entropy after segmentation is used to measure the effectiveness of the MRI image processing. Experimental results show that the proposed model outperforms traditional approaches.
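
As a rough illustration (not the authors' code), the sketch below scores one GA individual, a vector of candidate cluster centers, by the Shannon entropy of the resulting hard segmentation; all names and data are assumptions.

```python
import numpy as np

def assign_to_centers(pixels, centers):
    """Hard-assign pixel intensities to the nearest candidate center."""
    d = np.abs(pixels.ravel()[:, None] - centers[None, :])
    return d.argmin(axis=1)

def segmentation_entropy(labels, n_clusters):
    """Shannon entropy of the segment-label distribution; a hypothetical
    stand-in for the paper's entropy-based evaluation of a segmentation."""
    counts = np.bincount(labels, minlength=n_clusters)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Score one GA individual (a vector of candidate gray-level centers)
img = np.random.randint(0, 256, (64, 64))
centers = np.array([30.0, 120.0, 200.0])
print(segmentation_entropy(assign_to_centers(img, centers), len(centers)))
```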

Keywords: magnetic resonance image (MRI), c-means model, image segmentation, information entropy

Procedia PDF Downloads 206
154 Parking Space Detection and Trajectory Tracking Control for Vehicle Auto-Parking

Authors: Shiuh-Jer Huang, Yu-Sheng Hsu

Abstract:

An on-board parking-space detection system, parking trajectory planning, and a tracking control mechanism are the key components of a vehicle backward auto-parking system. First, a pair of ultrasonic sensors is installed on each side of the vehicle body to detect the relative distance between the ego-car and surrounding obstacles. The dimensions of a detected empty space can be calculated from the vehicle speed and the time history of the ultrasonic sensor readings; this result is used to construct a 2D environmental map of the vehicle's surroundings and to judge the available parking type. Finally, the auto-parking controller performs on-line optimal parking trajectory planning based on this 2D environmental map and monitors the real-time tracking control of the vehicle's parking trajectory. This low-cost auto-parking system was tested on a model car.
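
The slot measurement reduces to integrating vehicle speed over the interval in which the side sensor sees a range beyond a depth threshold; a minimal sketch, where the names and the 1.5 m threshold are illustrative assumptions rather than values from the paper:

```python
def slot_length(timestamps, ranges, speeds, depth_threshold=1.5):
    """Estimate parking-slot length (m) from side-mounted ultrasonic data:
    accumulate speed * dt while the lateral range exceeds the threshold."""
    length = 0.0
    for i in range(1, len(timestamps)):
        if ranges[i] > depth_threshold:              # passing an empty gap
            length += speeds[i] * (timestamps[i] - timestamps[i - 1])
    return length

# A short pass at 1 m/s; the gap is visible in three of five samples.
print(slot_length([0.0, 0.1, 0.2, 0.3, 0.4],
                  [0.8, 2.5, 2.6, 2.4, 0.9],
                  [1.0, 1.0, 1.0, 1.0, 1.0]))
```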

Keywords: vehicle auto-parking, parking space detection, parking path tracking control, intelligent fuzzy controller

Procedia PDF Downloads 220
153 Presenting a Job Scheduling Algorithm Based on Learning Automata in Computational Grid

Authors: Roshanak Khodabakhsh Jolfaei, Javad Akbari Torkestani

Abstract:

As cooperative problem-solving environments, grids must develop efficient job-scheduling patterns suited to their goals, domains, and structure. Because grid environments facilitate distributed computation, job scheduling is a critical resource-management problem that strongly influences the efficiency of the whole grid environment. Owing to characteristics such as resource dynamicity and changing network conditions, a scheduling algorithm should remain adjustable and scalable as the network grows. For this purpose, this paper presents a job-scheduling algorithm based on learning automata for computational grids and compares its performance with the Fuzzy Particle Swarm Optimization (FPSO) and Grid Job Scheduling (GJS) algorithms. The numerical results indicate the superiority of the suggested algorithm, with FPSO and GJS ranking second and third, respectively.
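
The abstract does not specify the automaton's update scheme; a common choice in learning-automata schedulers is linear reward-inaction (L_RI), sketched here with illustrative names and a toy reward:

```python
import random

def l_ri_update(probs, chosen, reward, a=0.1):
    """Linear reward-inaction update: on reward, move probability mass
    toward the chosen action; on penalty, leave the vector unchanged."""
    if reward:
        for i in range(len(probs)):
            probs[i] = probs[i] + a * (1.0 - probs[i]) if i == chosen \
                       else probs[i] * (1.0 - a)
    return probs

# Toy run: each action picks a candidate grid resource for the next job.
probs = [0.25, 0.25, 0.25, 0.25]
for _ in range(200):
    action = random.choices(range(4), weights=probs)[0]
    probs = l_ri_update(probs, action, reward=(action == 2))  # resource 2 is best
print([round(p, 3) for p in probs])
```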

Keywords: computational grid, job scheduling, learning automata, dynamic scheduling

Procedia PDF Downloads 318
152 Comparative Rumen Degradable and Rumen Undegradable Fractions in Untreated, Formaldehyde and Heat Treated Vegetable Protein Sources of Pakistan

Authors: Illahi Bakhsh Marghazani, Nasrullah, Masood Ul Haq Kakar, Abdul Hameed Baloch, Ahmad Nawaz Khoso, Behram Chacher

Abstract:

Protein sources are the major part of the ration fed to dairy buffaloes in Pakistan; however, their limited availability and the lack of judicious use further hamper efforts to enhance milk and meat production. To gain maximum production from limited protein sources, it is necessary to balance feed for rumen degradable and rumen undegradable protein fractions. This study was planned to determine the rumen degradable and rumen undegradable fractions of common vegetable protein sources with (formaldehyde and heat) and without treatment. Samples of soybean meal, corn gluten meal 60%, maize gluten feed, guar meal, sunflower meal, rapeseed meal, rapeseed cake, canola meal, cottonseed cake, cottonseed meal, coconut cake, coconut meal, palm kernel cake, almond cake, and sesame cake were collected from ten different geographical locations of Pakistan. These samples were also subjected to formaldehyde treatment (1%/100 g CP of test feed) and heat treatment (1 h at 15 psi/100 g CP of test feed). The in situ technique was used to determine ruminal degradability characteristics, and the data obtained were fitted to the Ørskov equation. Results showed that both treatments significantly (P < 0.05) decreased ruminal degradability relative to the untreated vegetable protein sources; of the two, heat treatment was more effective than formaldehyde treatment in decreasing ruminal degradability for most of the studied sources.
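
The Ørskov (and McDonald) model fitted to such in situ data is p(t) = a + b(1 − e^(−ct)), with effective degradability ED = a + bc/(c + k) for an assumed outflow rate k; a minimal curve-fitting sketch on made-up disappearance values:

```python
import numpy as np
from scipy.optimize import curve_fit

def orskov(t, a, b, c):
    """Degradation (%) at incubation time t (h): a = soluble fraction,
    b = potentially degradable fraction, c = degradation rate (/h)."""
    return a + b * (1.0 - np.exp(-c * t))

# Illustrative disappearance data, not values from the study
t = np.array([0.0, 2, 4, 8, 16, 24, 48])
p = np.array([18.0, 30, 38, 50, 62, 68, 74])

(a, b, c), _ = curve_fit(orskov, t, p, p0=(15, 60, 0.05))
k = 0.05                                  # assumed ruminal outflow rate (/h)
ed = a + (b * c) / (c + k)                # effective degradability
print(f"a={a:.1f}  b={b:.1f}  c={c:.3f}  ED={ed:.1f}%")
```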

Keywords: formaldehyde and heat treatments, in situ technique, rumen degradable and rumen undegradable fractions, vegetable protein sources

Procedia PDF Downloads 305
151 Improvement of Transient Voltage Response Using PSS-SVC Coordination Based on ANFIS-Algorithm in a Three-Bus Power System

Authors: I Made Ginarsa, Agung Budi Muljono, I Made Ari Nrartha

Abstract:

A transient voltage response appears in power system operation when additional loading is forced onto a load bus. In this research, the transient voltage response is improved by using a power system stabilizer and static var compensator (PSS-SVC) based on an adaptive neuro-fuzzy inference system (ANFIS) algorithm. The main function of the PSS is to add a damping component to damp rotor oscillation through the automatic voltage regulator (AVR) and excitation system. The ANFIS is trained off-line, with the training data obtained by simulating the conventional PSS-SVC. The ANFIS model uses seven Gaussian membership functions on each of two inputs and 49 rules at the output. The ANFIS-PSS and ANFIS-SVC models are then applied to the power system. Simulation results show that the transient voltage response is improved, with a settling time of 4.25 s.
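
A sketch of the premise layer that description implies: seven Gaussian membership functions per input combined into a 7 × 7 = 49-rule grid with a product T-norm. The centers and widths below are illustrative, not the trained parameters:

```python
import numpy as np
from itertools import product

def gaussmf(x, c, sigma):
    """Gaussian membership function used in the ANFIS premise layer."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

centers = np.linspace(-1.0, 1.0, 7)   # 7 MFs over a normalized input range
sigma = 0.35

def rule_firing_strengths(x1, x2):
    """Product T-norm over the 7 x 7 = 49 rule premises."""
    mf1, mf2 = gaussmf(x1, centers, sigma), gaussmf(x2, centers, sigma)
    return np.array([m1 * m2 for m1, m2 in product(mf1, mf2)])

w = rule_firing_strengths(0.2, -0.4)
print(len(w))                          # 49 rule strengths, normalized downstream
```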

Keywords: improvement, transient voltage, PSS-SVC, ANFIS, settling time

Procedia PDF Downloads 547
150 Automatic Facial Skin Segmentation Using Possibilistic C-Means Algorithm for Evaluation of Facial Surgeries

Authors: Elham Alaee, Mousa Shamsi, Hossein Ahmadi, Soroosh Nazem, Mohammad Hossein Sedaaghi

Abstract:

The human face plays a fundamental role in an individual's appearance, so the importance of facial surgery is undeniable. There is thus a need for appropriate and accurate facial skin segmentation in order to extract different features. Since the fuzzy C-means (FCM) clustering algorithm does not work well with noisy images and outliers, this paper exploits the possibilistic C-means (PCM) algorithm to segment the facial skin. For this purpose, facial images are first converted from the RGB to the YCbCr color space. To evaluate the performance of the proposed algorithm, the database of Sahand University of Technology, Tabriz, Iran was used. For a better assessment, the FCM and Expectation-Maximization (EM) algorithms are also applied to facial skin segmentation. The proposed method shows better results than the other segmentation methods, with a misclassification error of 0.032 and a region-area error of 0.045.
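
PCM's robustness comes from replacing FCM's relative memberships with per-cluster typicalities; a minimal sketch of the standard Krishnapuram–Keller updates (the exact variant used in the paper is an assumption):

```python
import numpy as np

def pcm_typicality(d2, eta, m=2.0):
    """Typicality t = 1 / (1 + (d^2/eta)^(1/(m-1))): it depends only on the
    distance to the cluster's own prototype, so outliers score low everywhere."""
    return 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1.0)))

def pcm_step(X, centers, eta, m=2.0):
    """One PCM iteration: typicalities, then prototype update."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    T = pcm_typicality(d2, eta, m)
    Tm = T ** m
    return T, (Tm.T @ X) / Tm.sum(axis=0)[:, None]

# Toy run on 2-D points standing in for YCbCr pixel features
rng = np.random.default_rng(0)
X, centers, eta = rng.normal(size=(200, 2)), rng.normal(size=(2, 2)), np.ones(2)
for _ in range(20):
    T, centers = pcm_step(X, centers, eta)
```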

Keywords: facial image, segmentation, PCM, FCM, skin error, facial surgery

Procedia PDF Downloads 558
149 Decision Support System for a Pilot Flash Flood Early Warning System in Central Chile

Authors: D. Pinto, L. Castro, M. L. Cruzat, S. Barros, J. Gironás, C. Oberli, M. Torres, C. Escauriaza, A. Cipriano

Abstract:

Flash floods, together with landslides, are a common natural threat for people living in mountainous regions and foothills. One way to deal with this constant menace is the use of early warning systems, which have become a very important mitigation strategy for natural disasters. In this work, we present our proposal for a pilot flash flood early warning system for Santiago, Chile, the first stage of a more ambitious project whose later stages will also include early warning of landslides. To give context for our approach, we first analyze three existing flash flood early warning systems, focusing on their general architectures. We then present our proposed system, with the main focus on the decision support system, which integrates empirical models and fuzzy expert systems to achieve reliable risk estimations.

Keywords: decision support systems, early warning systems, flash flood, natural hazard

Procedia PDF Downloads 341
148 Power Aware Modified I-LEACH Protocol Using Fuzzy IF Then Rules

Authors: Gagandeep Singh, Navdeep Singh

Abstract:

Because sensor nodes have limited battery capacity, energy efficiency is the main constraint in wireless sensor networks (WSNs). The main focus of the present work is therefore to find ways to minimize energy consumption and thereby enhance the network's stability period and lifetime. Many researchers have proposed different protocols to extend network lifetime; this paper evaluates issues that have been neglected in the field. WSNs are composed of multiple unattended, ultra-small, limited-power sensor nodes deployed randomly in the area of interest. These nodes have limited processing, wireless communication, and power capabilities, and they send sensed data to a sink or base station (BS). I-LEACH provides an adaptive clustering mechanism that deals efficiently with energy conservation. The paper concludes with the shortcomings of various adaptive-clustering-based WSN protocols.
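
For context, the classic LEACH rotation rule that I-LEACH builds on elects cluster heads with the threshold T(n) = p / (1 − p·(r mod 1/p)); the sketch below implements that baseline (I-LEACH's residual-energy weighting is not detailed in the abstract):

```python
import random

def leach_threshold(p, r):
    """Classic LEACH cluster-head threshold for round r, where p is the
    desired fraction of cluster heads per round."""
    return p / (1.0 - p * (r % int(1.0 / p)))

def elect_cluster_heads(node_ids, p, r):
    """A node elects itself cluster head if a uniform draw falls below T(n).
    (Nodes already heads in the current epoch would be excluded in full LEACH.)"""
    t = leach_threshold(p, r)
    return [n for n in node_ids if random.random() < t]

print(elect_cluster_heads(range(100), p=0.05, r=3))
```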

Keywords: WSN, I-Leach, MATLAB, sensor

Procedia PDF Downloads 252
147 Computational Cell Segmentation in Immunohistochemically Image of Meningioma Tumor Using Fuzzy C-Means and Adaptive Vector Directional Filter

Authors: Vahid Anari, Leila Shahmohammadi

Abstract:

Manually diagnosing and interpreting a large cohort of immunohistochemically (IHC) stained tumor tissue under an optical microscope is subjective and tedious for specialist pathologists. Moreover, digital pathology today represents more of an evolution than a revolution in pathology. In this paper, we develop and test an unsupervised algorithm that can automatically enhance the IHC image of a meningioma tumor and classify cells as positive (proliferative) or negative (normal). A dataset of 150 images is used to test the scheme. In addition, a new adaptive color image enhancement method is proposed based on a vector directional filter (VDF) and the statistical properties of the filtering window. Since the cells are distinguishable by the human eye, the accuracy and stability of the algorithm are quantitatively compared through application to a wide variety of real images.

Keywords: digital pathology, cell segmentation, immunohistochemically, noise reduction

Procedia PDF Downloads 45
146 A Hybrid Data-Handler Module Based Approach for Prioritization in Quality Function Deployment

Authors: P. Venu, Joeju M. Issac

Abstract:

Quality Function Deployment (QFD) is a systematic technique that creates a platform where customer responses can be positively converted into design attributes. The accuracy of a QFD process depends heavily on the data it handles, which is captured from customers or QFD team members. Customized computer programs that perform QFD within a stipulated time have been used by companies across the globe. These programs rely heavily on storing and retrieving data from a common database, which must act as a reliable source with minimal missing or erroneous values in order to perform the actual prioritization. This paper introduces a missing/error data handler module that uses a genetic algorithm and fuzzy numbers. The prioritization of customer requirements for sesame oil is illustrated, and a comparison is made between deployment based on the proposed data handler module and manual deployment.

Keywords: hybrid data handler, QFD, prioritization, module-based deployment

Procedia PDF Downloads 271
144 A Statistical Approach to Predict and Classify the Commercial Hatchability of Chickens Using Extrinsic Parameters of Breeders and Eggs

Authors: M. S. Wickramarachchi, L. S. Nawarathna, C. M. B. Dematawewa

Abstract:

Hatchery performance is critical for the profitability of poultry breeder operations, and some extrinsic parameters of eggs and breeders increase or decrease hatchability. This study aims to identify the extrinsic parameters affecting the commercial hatchability of local chickens' eggs and to determine the most efficient classification model, targeting a hatchability rate greater than 90%. Seven extrinsic parameters were considered: egg weight, moisture loss, breeder age, number of fertilised eggs, shell width, shell length, and shell thickness. Multiple linear regression was performed to determine the most influential variables on hatchability: first, the correlation between each parameter and hatchability was checked; then a multiple regression model was developed and the accuracy of the fitted model evaluated. Linear discriminant analysis (LDA), classification and regression trees (CART), k-nearest neighbors (kNN), support vector machines (SVM) with a linear kernel, and random forest (RF) algorithms were applied to classify hatchability as a binary outcome. Hatchability was negatively correlated with egg weight, breeder age, shell width, and shell length, and positively correlated with moisture loss, number of fertilised eggs, and shell thickness. Multiple linear regression models were more accurate than single linear models, with the highest coefficient of determination (R²) of 94% and minimum AIC and BIC values. According to the classification results, RF, CART, and kNN achieved the highest accuracies of 0.99, 0.975, and 0.972, respectively, for the commercial hatchery process. Therefore, RF is the most appropriate machine learning algorithm for classifying whether breeder outcomes are economically profitable in a commercial hatchery.
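
A minimal sketch of the winning random-forest step on synthetic stand-in data; the feature layout and the >90% hatchability label rule below are assumptions, not the study's dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Columns mirror the seven parameters: egg weight, moisture loss, breeder
# age, fertilised eggs, shell width, shell length, shell thickness.
X = rng.normal(size=(500, 7))
y = (X[:, 1] + X[:, 6] - X[:, 0] > 0).astype(int)   # 1 = hatchability > 90%

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(accuracy_score(y_te, rf.predict(X_te)))
```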

Keywords: classification models, egg weight, fertilised eggs, multiple linear regression

Procedia PDF Downloads 64
143 Solving LWE by Progressive Pumps and Its Optimization

Authors: Leizhang Wang, Baocang Wang

Abstract:

The General Sieve Kernel (G6K) is currently considered the fastest algorithm for the shortest vector problem (SVP) and is the record holder of the open SVP challenge. We study the lattice-basis quality improvement effected by the Workout proposed in G6K, which is composed of a series of pumps to solve SVP. First, we use low-dimensional pump output bases to build a predictor for the quality of high-dimensional pump output bases. Both theoretical analysis and experiments show that solving LWE problems with G6K's default SVP solving strategy (the Workout) is more computationally expensive than using lattice reduction algorithms (e.g., BKZ 2.0, progressive BKZ, Pump, and Jump BKZ) with sieving as their SVP oracle. Second, the default Workout in G6K is optimized to achieve stronger reduction at lower computational cost. Third, we combine the optimized Workout and the pump output basis quality predictor to further reduce the computational cost by optimizing the LWE instance selection strategy; in fact, we can solve the TU LWE challenge (n = 65, q = 4225, α = 0.005) 13.6 times faster than the default G6K Workout. Fourth, we consider a combined two-stage LWE solving strategy (preprocessing by BKZ-β followed by one big pump), with both stages using the dimension-for-free technique, to give new theoretical security estimates for several LWE-based cryptographic schemes. These estimates show that the security of the schemes under NewHope's conservative core-SVP model is somewhat overestimated. In addition, in the case of the LAC scheme, the LWE instance selection strategy can be optimized to improve LWE-solving efficiency by a further 15% and 57%. Finally, experiments on Normal Form LWE problems demonstrate that the combined strategy is four times faster than that of NewHope.

Keywords: LWE, G6K, pump estimator, LWE instances selection strategy, dimension for free

Procedia PDF Downloads 38
142 Interpretation and Clustering Framework for Analyzing ECG Survey Data

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions given in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm together with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
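
One plausible reading of that pipeline, sketched below: embed the selected ECG features spectrally, then soft-cluster the embedding with a plain fuzzy c-means loop. The feature matrix, cluster count, and embedding settings are all assumptions:

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means returning memberships U and cluster centers."""
    U = np.random.default_rng(seed).dirichlet(np.ones(c), size=len(X))
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        U = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=1, keepdims=True))
    return U, centers

# Stand-in ECG feature vectors (e.g., selected Minnesota-code features)
X = np.random.default_rng(1).normal(size=(200, 12))
Z = SpectralEmbedding(n_components=3).fit_transform(X)   # spectral step
U, _ = fcm(Z, c=4)                                       # fuzzy step
labels = U.argmax(axis=1)
```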

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 444
141 Performance and Emission Prediction in a Biodiesel Engine Fuelled with Honge Methyl Ester Using RBF Neural Networks

Authors: Shiva Kumar, G. S. Vijay, Srinivas Pai P., Shrinivasa Rao B. R.

Abstract:

In the present study, RBF neural networks were used for predicting the performance and emission parameters of a biodiesel engine. Engine experiments were carried out on a four-stroke diesel engine using blends of diesel and Honge methyl ester as the fuel. Performance parameters such as BTE, BSEC, and exhaust gas temperature (Texh), along with emissions from the engine, were measured, and these experimental results were used for ANN modeling. RBF center initialization was done both by random selection and by clustering techniques, and the network was trained using fixed and varying widths for the RBF units. The RBF results showed good agreement with the experimental results. Networks trained using the clustering technique gave better results than those using random selection of centers, in terms of reduced MRE and increased prediction accuracy. The average MRE for the performance parameters was 3.25% with a prediction accuracy of 98%, and for emissions it was 10.4% with a prediction accuracy of 80%.
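
A compact sketch of the network type described, with Gaussian RBF units whose centers come from k-means (the "clustered" initialization) and linear output weights solved by least squares; the class and parameter names are illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans

class RBFNet:
    """Minimal RBF network: k-means centers, one fixed Gaussian width,
    output weights from a linear least-squares solve."""
    def __init__(self, n_centers=10, width=1.0):
        self.n_centers, self.width = n_centers, width

    def _phi(self, X):
        d = np.linalg.norm(X[:, None] - self.centers[None], axis=2)
        return np.exp(-(d ** 2) / (2.0 * self.width ** 2))

    def fit(self, X, y):
        km = KMeans(n_clusters=self.n_centers, n_init=10, random_state=0)
        self.centers = km.fit(X).cluster_centers_
        self.w, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)
        return self

    def predict(self, X):
        return self._phi(X) @ self.w

# e.g. X = [load, blend fraction] rows, y = measured BTE values:
# model = RBFNet(n_centers=8, width=0.5).fit(X, y); model.predict(X_new)
```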

Keywords: radial basis function networks, emissions, performance parameters, fuzzy c means

Procedia PDF Downloads 532
140 Chemical Reaction Algorithm for Expectation Maximization Clustering

Authors: Li Ni, Pen ManMan, Li KenLi

Abstract:

Clustering has been an area of intensive research for many years because of its multifaceted applications in fields such as biology, information retrieval, medicine, and business. Expectation maximization (EM) is an algorithmic framework used in clustering methods and one of the ten classic algorithms of machine learning. Traditionally, optimization of an objective function has been the standard approach in EM, and research has therefore investigated the utility of evolutionary computing and related techniques in this regard. Chemical Reaction Optimization (CRO) is a recently established method whose embedded properties are used to solve optimization problems. This paper presents an algorithmic framework (EM-CRO) with modified CRO operators for EM clustering problems. The hybrid algorithm mainly addresses the initial-value sensitivity of objective-function-optimization clustering algorithms. Our experiments take the classic EM-style algorithms k-means and fuzzy k-means as examples and use the CRO algorithm to optimize their initial values, yielding the K-means-CRO and FKM-CRO algorithms. The experimental results show improved efficiency in solving objective-function-optimization clustering problems.

Keywords: chemical reaction optimization, expection maimization, initia, objective function clustering

Procedia PDF Downloads 687
139 Analysis of ECGs Survey Data by Applying Clustering Algorithm

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions given in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm together with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 329
138 Credit Risk Assessment Using Rule Based Classifiers: A Comparative Study

Authors: Salima Smiti, Ines Gasmi, Makram Soui

Abstract:

Credit risk is the most important issue for financial institutions, and its assessment has become an important task used to predict defaulting customers and to classify customers as good or bad payers. To this end, numerous techniques have been applied to credit risk assessment. However, to our knowledge, many evaluation techniques are black-box models, such as neural networks and SVMs, which generate applicant classes without any explanation. In this paper, we propose to assess credit risk using a rule-based classification method whose output is a set of rules that describe and explain the decision. We compare seven classification algorithms (JRip, Decision Table, OneR, ZeroR, Fuzzy Rule, PART, and genetic programming (GP)), where the goal is to find the rules that best satisfy three criteria: accuracy, sensitivity, and specificity. The obtained results confirm the efficiency of the GP algorithm on the German and Australian datasets compared with the other rule-based techniques for predicting credit risk.
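
The three criteria reduce to confusion-matrix ratios; a minimal sketch (the counts are made up, not results from the paper):

```python
def rule_quality(tp, fp, tn, fn):
    """Accuracy, sensitivity (recall on defaulters), and specificity
    computed from a rule set's confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

print(rule_quality(tp=180, fp=60, tn=640, fn=120))
```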

Keywords: credit risk assessment, classification algorithms, data mining, rule extraction

Procedia PDF Downloads 153
137 Classifying Affective States in Virtual Reality Environments Using Physiological Signals

Authors: Apostolos Kalatzis, Ashish Teotia, Vishnunarayan Girishan Prabhu, Laura Stanley

Abstract:

Emotions are functional behaviors influenced by thoughts, stimuli, and other factors that induce neurophysiological changes in the human body. Understanding and classifying emotions is challenging because individuals perceive their environments differently. It is therefore crucial to have publicly available databases and virtual reality (VR) environments that have been scientifically validated for assessing emotion classification. This study utilized two commercially available VR applications (Guided Meditation VR™ and Richie’s Plank Experience™) to induce acute stress and a calm state among participants, and subjective and objective measures were collected to create a validated multimodal dataset and classification scheme for affective states. The subjective measures included the Self-Assessment Manikin, emotion cards, and a 9-point visual analogue scale for perceived stress, collected using a Virtual Reality Assessment Tool developed by our team. The objective measures comprised electrocardiogram and respiration data collected from 25 participants (15 M, 10 F, mean age 22.28 ± 4.92). The features extracted from these data included heart rate variability components and respiration rate, both of which were used to train two machine learning models. The subjective responses validated the efficacy of the VR applications in eliciting the two desired affective states; for classifying the affective states, a logistic regression (LR) model and a support vector machine (SVM) with a linear kernel were developed. The LR outperformed the SVM, achieving 93.8%, 96.2%, and 93.8% leave-one-subject-out cross-validation accuracy, precision, and recall, respectively. The VR assessment tool and the data collected in this study are publicly available to other researchers.
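
Leave-one-subject-out evaluation maps directly onto grouped cross-validation in scikit-learn; a sketch on synthetic stand-ins for the HRV and respiration-rate features:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))            # HRV + respiration-rate features
y = rng.integers(0, 2, size=500)         # 0 = calm, 1 = acute stress
groups = rng.integers(0, 25, size=500)   # which of the 25 subjects each row is from

scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                         groups=groups, cv=LeaveOneGroupOut())
print(scores.mean())                     # leave-one-subject-out accuracy
```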

Keywords: affective computing, biosignals, machine learning, stress database

Procedia PDF Downloads 112
136 Effect of Cryogenic Pre-stretching on the Room Temperature Tensile Behavior of AZ61 Magnesium Alloy and Dominant Grain Growth Mechanisms During Subsequent Annealing

Authors: Umer Masood Chaudry, Hafiz Muhammad Rehan Tariq, Chung-soo Kim, Tea-sung Jun

Abstract:

This study explored the influence of pre-stretching temperature on the microstructural characteristics and deformation behavior of AZ61 magnesium alloy and its implications for grain growth during subsequent annealing. AZ61 alloy was stretched to 5% plastic strain along the rolling (RD) and transverse (TD) directions at room temperature (RT) and cryogenic temperature (−150 °C, CT), followed by annealing at 320 °C for 1 h, to investigate twinning and dislocation evolution and their consequent effects on flow stress, plastic strain, and strain hardening rate. Compared with RT-stretched samples, the CT-stretched samples showed significant improvements in yield stress and strain hardening rate along both RD and TD, with a moderate reduction in elongation to failure. Subsequent EBSD analysis revealed an increased fraction of fine {10-12} twins and the nucleation of multiple {10-12} twin variants, caused by higher local stress concentration at the grain boundaries in CT-stretched samples, as manifested by the kernel average misorientation. This higher twin fraction and the twin-twin interactions strengthened the material by restricting the mean free path of dislocations, leading to higher flow stress and strain hardening rate. During annealing of the RT/CT-stretched samples, the residual strain energy and twin boundaries decreased due to static recovery, leading to a coarse-grained, twin-free microstructure. Strain-induced boundary migration (SIBM) was found to be the predominant mechanism governing grain growth during annealing, via the movement of high-angle grain boundaries.

Keywords: magnesium, twinning, twinning variant selection, EBSD, cryogenic deformation

Procedia PDF Downloads 47
135 The Appraisal of Construction Sites Productivity: In Kendall’s Concordance

Authors: Abdulkadir Abu Lawal

Abstract:

Owing to the dearth of reliable cardinal numerical data, linked phenomena in productivity indices, such as operational costs and company turnovers, could not be investigated, and this would give no insight into the root of productivity problems at individual sites. Therefore, ordinal ranking by the professionals most directly involved with construction sites was analyzed using Kendall's concordance. Responses gathered from independent architects, builders/engineers, and quantity surveyors are analyzed herein. The responses concerned factors that affect site productivity, categorized as head-office factors, resource-management effectiveness factors, motivational factors, and training/skill-development factors. It was found that productivity is low and has to be improved in order to support Nigeria's efforts to bridge its infrastructure deficit. The significance of this work is underlined by a Kendall's coefficient of concordance of 0.78, and remedial measures must be emphasized to stimulate better productivity. A further detailed study could apply fuzzy logic analysis to a wider Delphi survey.
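
Kendall's W for m raters ranking n items is W = 12S / (m²(n³ − n)), where S is the sum of squared deviations of the item rank-sums from their mean (ties are ignored in this sketch); toy rankings illustrate the computation:

```python
import numpy as np

def kendalls_w(ranks):
    """Kendall's coefficient of concordance for an (m raters, n items)
    matrix of rankings 1..n; returns a value in [0, 1]."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Three professions ranking five productivity factors (made-up data)
print(kendalls_w([[1, 2, 3, 4, 5],
                  [1, 3, 2, 4, 5],
                  [2, 1, 3, 5, 4]]))
```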

Keywords: factors, Kendall's coefficient of concordance, magnitude of agreement, percentage magnitude of dichotomy, ranking variables

Procedia PDF Downloads 599
134 Formalizing a Procedure for Generating Uncertain Resource Availability Assumptions Based on Real Time Logistic Data Capturing with Auto-ID Systems for Reactive Scheduling

Authors: Lars Laußat, Manfred Helmus, Kamil Szczesny, Markus König

Abstract:

As one result of the project “Reactive Construction Project Scheduling Using Real-Time Construction Logistic Data and Simulation”, a procedure for using uncertain resource availability assumptions in reactive scheduling processes has been developed. Prediction data about resource availability are generated in a formalized way from real-time monitoring data, e.g., from auto-ID systems on the construction site and in the supply chains. The paper focuses on formalizing the procedure for monitoring construction logistic processes, detecting disturbances, and generating new, uncertain scheduling assumptions for the reactive resource-constrained simulation procedure, which is described further in other papers.

Keywords: auto-ID, construction logistic, fuzzy, monitoring, RFID, scheduling

Procedia PDF Downloads 487
133 Screening of Factors Affecting the Enzymatic Hydrolysis of Empty Fruit Bunches in Aqueous Ionic Liquid and Locally Produced Cellulase System

Authors: Md. Z. Alam, Amal A. Elgharbawy, Muhammad Moniruzzaman, Nassereldeen A. Kabbashi, Parveen Jamal

Abstract:

The enzymatic hydrolysis of lignocellulosic biomass is one of the obstacles in sugar production, owing to the presence of lignin, which protects the cellulose molecules against cellulases. Although the pretreatment of lignocellulose in ionic liquid (IL) systems has received much interest, it requires IL removal with an anti-solvent before enzymatic hydrolysis can proceed; at this point, introducing an IL-compatible cellulase seems more efficient. A cellulase produced by Trichoderma reesei on palm kernel cake (PKC), called PKC-Cel, exhibited promising stability in several ILs and was characterized for its optimum pH, optimum temperature, and molecular weight. One of the evaluated ILs, 1,3-diethylimidazolium dimethyl phosphate ([DEMIM] DMP), was applied in this study. Six factors were evaluated in a definitive screening design in Stat-Ease Design Expert V.9: IL/buffer ratio, temperature, hydrolysis retention time, biomass loading, cellulase loading, and empty fruit bunch (EFB) particle size. According to the data obtained, the IL-enzyme system gives the highest sugar concentration at 70 °C, 27 hours, 10% IL-buffer, 35% biomass loading, 60 units/g cellulase, and 200 μm particle size. PKC-Cel was not only stable in the presence of the IL but also stable at a temperature higher than its optimum. The reducing sugar obtained was 53.468 ± 4.58 g/L, equivalent to 0.3055 g reducing sugar/g EFB. This approach opens an avenue for further studies of the actual effect of ILs on cellulases and their interactions in aqueous systems, and it could benefit the efficient production of bioethanol from lignocellulosic biomass.

Keywords: cellulase, hydrolysis, lignocellulose, pretreatment

Procedia PDF Downloads 331
132 Risk Prioritization in Tunneling Construction Projects

Authors: David Nantes, George Gilbert

Abstract:

Many risks can crop up as a tunneling project develops, and it is crucial to be aware of them. Because of the unpredictable nature of tunneling projects and the interconnectedness of risk occurrences, risk assessment presents a significant challenge. The purpose of this study is to provide a hybrid FDEMATEL-ANP model to help prioritize risks during tunnel construction projects. The model accounts for the ambiguity in expert judgments and the relative severity of interdependencies across risk occurrences through the fuzzy Decision-Making Trial and Evaluation Laboratory (FDEMATEL), while the Analytic Network Process (ANP) is used to rank priorities and assess project risks. The authors provide a case study of a subway tunneling construction project to support the validity of the methodology. The results showed that the proposed method successfully isolated key risk factors and elucidated their interplay in the case study, and it has the potential to become a helpful resource for evaluating the risks associated with tunnel construction projects.

Keywords: risk, prioritization, FDEMATEL, ANP, tunneling construction projects

Procedia PDF Downloads 64
131 Modeling and Analyzing Controversy in Large-Scale Cyber-Argumentation

Authors: Najla Althuniyan

Abstract:

Online discussions take place across different platforms. These discussions have the potential to extract crowd wisdom and capture collective intelligence from different perspectives. However, certain phenomena, such as controversy, often appear in online argumentation and make the discussion between participants heated; heated discussions can, in turn, be used to extract new knowledge. Detecting the presence of controversy is therefore an essential task in determining whether collective intelligence can be extracted from online discussions. This paper uses existing measures for estimating controversy quantitatively in cyber-argumentation. It first defines controversy in different fields and then identifies the attributes of controversy in online discussions. The distributions of user opinions and the distances between opinions are used to calculate the controversy degree of a discussion. Finally, the results from each controversy measure are discussed and analyzed using an empirical study generated with a cyber-argumentation tool. This is an improvement over existing measurements because it requires neither ground-truth data nor specific settings and can be adapted to distribution-based or distance-based opinions.

Keywords: online argumentation, controversy, collective intelligence, agreement analysis, collaborative decision-making, fuzzy logic

Procedia PDF Downloads 101
130 Create a Brand Value Assessment Model for Choosing a Cosmetic Brand in Tehran Combining DEMATEL Techniques and Multi-Stage ANFIS

Authors: Hamed Saremi, Suzan Taghavy, Seyed Mohammad Hanif Sanjari, Mostafa Kahali

Abstract:

One of the challenges for manufacturing and service companies in providing a product or service is making the brand recognizable to consumers in target markets, since most companies run their processes at similar capacities, while constant internal and external threats can devastate resources, prevent a brand's rise, and push more companies toward bankruptcy. This paper tries to identify and analyze effective indicators of brand equity and presents an intelligent model to assess it and prevent possible damage. In this study, the indicators of brand equity are identified from a literature study and expert opinions; the set of indicators is analyzed with the DEMATEL technique, and a multi-stage adaptive neuro-fuzzy inference system (ANFIS) is then used to design a multi-stage intelligent system for assessing brand equity.

Keywords: brand, cosmetic product, ANFIS, DEMATEL

Procedia PDF Downloads 395
129 Research on the Teaching Quality Evaluation of China’s Network Music Education APP

Authors: Guangzhuang Yu, Chun-Chu Liu

Abstract:

With the advent of the Internet era in recent years, social music education has gradually shifted from the original in-person mode to a combination of in-person and network teaching. Whether for school music education, professional music education, or social music education, teaching quality is the most important evaluation index. Regarding research on teaching quality evaluation, scholars at home and abroad have contributed many results based on multiple methods and evaluation subjects. However, to the best of our knowledge, a complete evaluation model for the virtual teaching interaction mode of the emerging network music education applications (APPs) has not been established. This research first identifies the basic dimensions of teaching quality required by the three parties and constructs a quality evaluation index system; then, after expounding the connotation of each index, it determines the weight of each index using the fuzzy analytic hierarchy process, providing ideas and methods for the scientific, objective, and comprehensive evaluation of the teaching quality of network education APPs.

Keywords: network music education APP, teaching quality evaluation, index and connotation

Procedia PDF Downloads 99
128 An Approaching Index to Evaluate Forward Collision Probability

Authors: Yuan-Lin Chen

Abstract:

This paper presents an approaching forward collision probability index (AFCPI) for alerting and assisting the driver in keeping a safe distance to avoid forward collision accidents in highway driving. Time to collision (TTC) and time headway (TH) are used to evaluate the TTC forward collision probability index (TFCPI) and the TH forward collision probability index (HFCPI), respectively. A Mamdani fuzzy inference algorithm is presented that combines TFCPI and HFCPI to calculate the vehicle's approaching collision probability index. The AFCPI is easy to understand even for drivers without professional knowledge of the vehicle field, and the driver's behavior is taken into account so that the index suits each driver. For the approaching index, a value of 0 indicates a 0% probability of forward collision, while values of 0.5 and 1 indicate 50% and 100% probabilities, respectively. The AFCPI is useful and easy to understand for alerting the driver to avoid forward collision accidents when driving on the highway.
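
TTC is the gap to the lead vehicle divided by the closing speed, and TH is the gap divided by the follower's own speed; the sketch below computes both and combines them with a simple average as a stand-in for the paper's Mamdani rule base (the critical values and the combination are assumptions):

```python
def ttc(gap_m, closing_speed_ms):
    """Time to collision (s); infinite when the gap is not closing."""
    return float('inf') if closing_speed_ms <= 0 else gap_m / closing_speed_ms

def time_headway(gap_m, own_speed_ms):
    """Time headway (s) of the following vehicle."""
    return float('inf') if own_speed_ms <= 0 else gap_m / own_speed_ms

def afcpi(ttc_s, th_s, ttc_crit=3.0, th_crit=1.5):
    """Toy index in [0, 1]: each sub-index rises toward 1 as its measure
    falls; the fuzzy inference step is replaced here by a plain mean."""
    tfcpi = min(1.0, max(0.0, 1.0 - ttc_s / (2 * ttc_crit)))
    hfcpi = min(1.0, max(0.0, 1.0 - th_s / (2 * th_crit)))
    return 0.5 * (tfcpi + hfcpi)

print(afcpi(ttc(20.0, 8.0), time_headway(20.0, 25.0)))
```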

Keywords: approaching index, forward collision probability, time to collision, time headway

Procedia PDF Downloads 267
127 Product Form Bionic Design Based on Eye Tracking Data: A Case Study of Desk Lamp

Authors: Huan Lin, Liwen Pang

Abstract:

In order to reduce the ambiguity and uncertainty of product form bionic design, a product form bionic design method based on eye tracking is proposed. An eye-tracking experiment is designed to calculate the average fixation-time ranking of the specific parts of the bionic shape that the subjects look at. The key bionic shape is identified through the experiment and then applied to a desk lamp bionic design. In the design case, the FAHP (fuzzy analytic hierarchy process) and SD (semantic differential) methods are first used to identify a consumer emotional perception model for desk lamps before product design. By investigating different desk lamp design elements and consumer views, the form design factors of the desk lamp are identified, and all design schemes are ranked after calculation. The desk lamp form bionic design method combines the key bionic shape extracted from the eye-tracking experiment with the priority ranking of the desk lamp design schemes. This study provides an objective and rational method for product form bionic design.

Keywords: bionic design, form, eye tracking, FAHP, desk lamp

Procedia PDF Downloads 190
126 A Location-Based Search Approach According to Users’ Application Scenario

Authors: Shih-Ting Yang, Chih-Yun Lin, Ming-Yu Li, Jhong-Ting Syue, Wei-Ming Huang

Abstract:

The global positioning system (GPS) has become increasingly precise in recent years, and location-based services (LBS) have developed rapidly. Take the example of finding a parking lot (with a parking app): a location-based service can offer immediate information about a nearby parking lot, including the number of remaining spaces, but it cannot provide search results tailored to the user's situational requirements. For that reason, this paper develops a location-based search approach according to users' application scenarios, combining location-based search with demand determination to help users obtain information consistent with their requirements. The approach consists of one mechanism and three kernel modules. First, in the Information Pre-processing Mechanism (IPM), the cosine theorem is used to categorize the locations of users. Then, in the Information Category Evaluation Module (ICEM), kNN (k-nearest neighbors) is employed to classify the browsing records of users. After that, in the Information Volume Level Determination Module (IVLDM), the number of users clicking on information at different locations is compared with the average number of users clicking on information at a specific location to evaluate the urgency of demand; a two-dimensional space is then used to estimate the users' application scenarios. Finally, in the Location-based Search Module (LBSM), all search results are compared against the average character count of the results, categorized with the Manhattan distance, and selected according to the user's application scenario. Additionally, a Web-based system is developed according to this methodology to demonstrate its practical application. The scenario-based estimation and location-based search are used to evaluate the type and abundance of the information the public expects at a specific location, so that information demanders can obtain information consistent with their application scenarios at that location.
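
A small sketch of the two distance-based steps named above, kNN classification of browsing records and Manhattan-distance grouping; the feature encoding of the records is an assumption:

```python
from collections import Counter

def manhattan(a, b):
    """Manhattan distance, as used by the LBSM to group search results."""
    return sum(abs(x - y) for x, y in zip(a, b))

def knn_classify(query, examples, k=3):
    """Majority label among the k nearest labeled feature vectors,
    mirroring the ICEM's classification of browsing records."""
    nearest = sorted(examples, key=lambda e: manhattan(query, e[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

records = [((2, 0, 5), "parking"), ((1, 1, 4), "parking"),
           ((9, 7, 0), "dining"), ((8, 6, 1), "dining")]
print(knn_classify((2, 1, 4), records))
```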

Keywords: data mining, knowledge management, location-based service, user application scenario

Procedia PDF Downloads 92