Search results for: minimal spanning tree
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 603

153 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria

Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova

Abstract:

Ambient air pollution with fine particulate matter (PM10) is a systematic, permanent problem in many countries around the world. The accumulation of a large number of measurements of both PM10 concentrations and the accompanying atmospheric factors allows their statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method to build and analyze PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria over a period of 5 years are used. Predictors in the models are seven meteorological variables, time variables, as well as lagged PM10 variables and some lagged meteorological variables, delayed by 1 or 2 days with respect to the initial time series. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast PM10 concentrations two days beyond the last date in the modeling period and show very accurate results.
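
For illustration only, here is a minimal sketch of the lagged-predictor CART setup described above, using scikit-learn's decision tree regressor; the variable names and the synthetic daily data are assumptions, not the authors' Pleven dataset.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

# Hypothetical daily data standing in for the Pleven measurements
dates = pd.date_range("2015-01-01", periods=1826, freq="D")
rng = np.random.default_rng(0)
meteo = pd.DataFrame({"temp": rng.normal(10, 8, len(dates)),
                      "wind": rng.gamma(2.0, 1.5, len(dates))}, index=dates)
pm10 = pd.Series(40 - 1.5 * meteo["wind"] + rng.normal(0, 5, len(dates)), index=dates)

X = meteo.copy()
X["pm10_lag1"] = pm10.shift(1)           # lagged PM10 predictors
X["pm10_lag2"] = pm10.shift(2)
X["wind_lag1"] = meteo["wind"].shift(1)  # a lagged meteorological predictor
X["day_of_year"] = dates.dayofyear       # simple time variable
data = X.assign(target=pm10).dropna()

cart = DecisionTreeRegressor(max_depth=6, min_samples_leaf=10)
cart.fit(data.drop(columns="target"), data["target"])
print(dict(zip(data.columns[:-1], cart.feature_importances_.round(3))))
```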

Keywords: Cross-validation, decision tree, lagged variables, short-term forecasting.

152 Classification of Political Affiliations by Reduced Number of Features

Authors: Vesile Evrim, Aliyu Awwal

Abstract:

With the evolution of technology, the expression of opinions has shifted to the digital world. The domain of politics, one of the hottest topics of opinion mining research, merged with behavior analysis for affiliation determination in texts, which constitutes the subject of this paper. This study aims to classify text in news/blogs as either Republican or Democrat with the minimum number of features. As an initial set, 68 features, 64 of which were Linguistic Inquiry and Word Count (LIWC) features, were tested against 14 benchmark classification algorithms. In later experiments, the dimensions of the feature vector were reduced using 7 feature selection algorithms. The results show that the “Decision Tree”, “Rule Induction” and “M5 Rule” classifiers, when used with the “SVM” and “IGR” feature selection algorithms, performed best, reaching up to 82.5% accuracy on the given dataset. Further tests on a single feature and on linguistic-based feature sets showed similar results. The feature “Function”, an aggregate feature of the linguistic category, was found to be the most differentiating feature among the 68, with an accuracy of 81% in classifying articles as either Republican or Democrat.
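
A minimal sketch of a feature-selection-plus-classifier pipeline of the kind described above, on synthetic stand-in data; scikit-learn's mutual-information filter is used here as a stand-in for the IGR selector, so this is not the authors' exact setup.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in for 68 LIWC-style features over labelled articles
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 68))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # 0 = Democrat, 1 = Republican (synthetic)

pipe = Pipeline([
    ("select", SelectKBest(mutual_info_classif, k=5)),   # information-gain-style filter
    ("tree", DecisionTreeClassifier(max_depth=4, random_state=0)),
])
print("10-fold accuracy:", cross_val_score(pipe, X, y, cv=10).mean().round(3))
```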

Keywords: Politics, machine learning, feature selection, LIWC.

151 Impact Assessment of Air Pollution Stress on Plant Species through Biochemical Estimations

Authors: Govindaraju M., Ganeshkumar R. S., Suganthi P., Muthukumaran V. R., Visvanathan P.

Abstract:

The present study was conducted to investigate the response of plants exposed to emissions from a lignite-based thermal power plant. For this purpose, five plant species were collected at a distance of 1.0 km from the power plant (polluted site), and control plants were collected at a distance of 20.0 km (control site). The common tree species Cassia siamea Lamk., Polyalthia longifolia Sonn., Acacia longifolia (Andrews) Wild., Azadirachta indica A. Juss and Ficus religiosa L. were selected as test plants. Changes in photosynthetic pigments (chlorophyll a, chlorophyll b and carotenoids) and modifications of the Rubisco enzyme were studied. A reduction was observed in the photosynthetic pigments of plants growing at the polluted site, and the large subunit of the Rubisco enzyme was degraded in Azadirachta indica A. Juss collected from the polluted site.

Keywords: Air pollution, Lignite-based thermal power plant, Photosynthetic pigments, Rubisco enzyme.

150 Antibacterial and Antifungal Activity of Essential Oil of Eucalyptus camaldulensis on a Few Bacteria and Fungi

Authors: M. Mehani, N. Salhi, T. Valeria, S. Ladjel

Abstract:

Red River Gum (Eucalyptus camaldulensis) is a tree of the genus Eucalyptus widely distributed in Algeria and around the world. The value of its aromatic secondary metabolites offers new perspectives for the pharmaceutical industry, and this strategy can contribute to the sustainable development of our country. Preliminary tests performed on the essential oil of Eucalyptus camaldulensis showed that this oil has antibacterial activity against the bacterial strains tested (Enterococcus faecalis, Enterobacter cloacae, Proteus mirabilis, Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa) and antifungal activity against Fusarium sporotrichioides and Fusarium graminearum. The culture medium used for the bacteria was Mueller-Hinton nutrient broth; the interaction between the bacteria and the essential oil is expressed as a zone of inhibition whose diameter indirectly reflects the minimum inhibitory concentration (MIC). The PDA medium was used to determine the antifungal activity. The aromatic fraction (essential oil and hydrolat) of the fresh aerial part of Eucalyptus camaldulensis was extracted by hydrodistillation, with an average essential oil yield of 0.99%. The antibacterial and antifungal study of the essential oil and hydrosol showed a high inhibitory effect on the growth of the pathogens.

Keywords: Essential oil, Eucalyptus camaldulensis, bacteria, fungi.

149 Decision Trees for Predicting Risk of Mortality using Routinely Collected Data

Authors: Tessy Badriyah, Jim S. Briggs, Dave R. Prytherch

Abstract:

It is well known that Logistic Regression is the gold standard method for predicting clinical outcome, especially risk of mortality. In this paper, the Decision Tree method is proposed to solve specific problems that commonly use Logistic Regression as a solution. The Biochemistry and Haematology Outcome Model (BHOM) dataset, obtained from Portsmouth NHS Hospital from 1 January to 31 December 2001, was divided into four subsets. One subset of training data was used to generate a model, and the model obtained was then applied to three testing datasets. The performance of each model from both methods was then compared using calibration (the χ2 test) and discrimination (area under the ROC curve, or c-index). The experiments showed that both methods give reasonable results in terms of the c-index; however, in some cases the calibration value (χ2) was quite high. After conducting the experiments and investigating the advantages and disadvantages of each method, we conclude that Decision Trees can be seen as a worthy alternative to Logistic Regression in the area of Data Mining.
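
A minimal sketch of this kind of comparison, measuring discrimination (the c-index) for both classifiers on synthetic data; the BHOM variables and the four-subset split are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for routinely collected biochemistry/haematology predictors
rng = np.random.default_rng(2)
X = rng.normal(size=(4000, 8))
y = (rng.random(4000) < 1 / (1 + np.exp(-(X[:, 0] + 0.8 * X[:, 1] - 2)))).astype(int)

# One training subset, the rest held out for testing (the paper used three test subsets)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.75, stratify=y, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("decision tree", DecisionTreeClassifier(min_samples_leaf=50, random_state=0))]:
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]
    print(f"{name}: c-index = {roc_auc_score(y_te, prob):.3f}")   # discrimination
```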

Keywords: Decision Trees, Logistic Regression, clinical outcome, risk of mortality.

148 Search for Flavour Changing Neutral Current Couplings of Higgs-up Sector Quarks at Future Circular Collider (FCC-eh)

Authors: I. Turk Cakir, B. Hacisahinoglu, S. Kartal, A. Yilmaz, A. Yilmaz, Z. Uysal, O. Cakir

Abstract:

In the search for new physics beyond the Standard Model, Flavour Changing Neutral Currents (FCNC) are a good research field in terms of observability at future colliders. Increased Higgs production at colliders with higher energy and luminosity is essential for verifying or falsifying our knowledge of physics and its predictions, and for the search for new physics. FCC-eh is the prospective electron-proton collider constituent of the Future Circular Collider project; it offers great sensitivity due to its high luminosity and low interference. In this work, the thq FCNC interaction vertex with off-shell top quark decay at electron-proton colliders is studied. Using the MadGraph5_aMC@NLO multi-purpose event generator, the observability of the tuh and tch couplings is obtained in the equal-coupling scenario. The upper limit on the branching ratio of tree-level top quark FCNC decay is determined as 0.012% at FCC-eh with 1 ab⁻¹ luminosity.

Keywords: FCC, FCNC, Higgs Boson, Top Quark.

147 Analysis of Genetic Variations in Camel Breeds (Camelus dromedarius)

Authors: Yasser M. Saad, Amr A. El Hanafy, Saleh A. Alkarim, Hussein A. Almehdar, Elrashdy M. Redwan

Abstract:

Camels are substantial providers of transport, milk, sport, meat, shelter, security and capital in many countries, particularly in Saudi Arabia. The inter-simple sequence repeat (ISSR) technique was used to detect genetic variations among some camel breeds (Majaheim, Safra, Wadah, and Hamara). The actual number of alleles, effective number of alleles, gene diversity, Shannon's information index and polymorphic bands were calculated for each evaluated camel breed. The neighbor-joining tree reconstructed for these camel breeds showed that the Hamara breed is distantly related to the other evaluated camels. In addition, the polymorphic sites, haplotypes and nucleotide diversity were identified for some Camelidae cox1 gene sequences (obtained from NCBI). The distance value between C. bactrianus and C. dromedarius (0.072) was relatively low. Analysis of genetic diversity is an important way of conserving Camelus dromedarius genetic resources.
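
The diversity measures named above follow standard formulas (Nei's gene diversity 1 - Σp_i² and Shannon's information index -Σp_i ln p_i); here is a minimal sketch with hypothetical band counts.

```python
import numpy as np

def diversity_indices(band_counts):
    """Nei's gene diversity (1 - sum p_i^2) and Shannon's information index
    (-sum p_i ln p_i) from allele/band counts at one locus."""
    p = np.asarray(band_counts, dtype=float)
    p = p / p.sum()          # allele frequencies
    p = p[p > 0]             # zero-frequency alleles contribute nothing
    return 1.0 - np.sum(p ** 2), -np.sum(p * np.log(p))

# Hypothetical ISSR band counts for one locus in one breed
h, i = diversity_indices([12, 7, 3])
print(f"gene diversity = {h:.3f}, Shannon index = {i:.3f}")
```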

Keywords: Camel, genetics, ISSR, cox1, neighbor-joining.

146 Effect of Sintering Temperature Curve in Wick Manufactured for Loop Heat Pipe

Authors: Shen-Chun Wu, Chuo-Jeng Huang, Wun-Hong Yang, Jy-Cheng Chang, Chien-Chun Kung

Abstract:

This investigation examines the effect of the sintering temperature curve in manufacturing a nickel powder capillary structure (wick) for a loop heat pipe (LHP). The sintering temperature curve is composed of a region of increasing temperature, a region of constant temperature and a region of declining temperature. The most important region is the one in which the temperature increases, and its slope serves as the index for this stage. The nickel powder wick is manufactured during the stage of fixed sintering temperature and the holding time between the stage of constant temperature and the stage of falling temperature. When the slope of the curve in the region of increasing temperature is unity (equivalent to 10 °C/min), the structure of the wick is complete and the heat transfer performance is optimal. The experimental results demonstrate that the heat transfer performance is optimal at 320 W; the minimal total thermal resistance is approximately 0.18 °C/W, and the heat flux is 17 W/cm²; the internal parameters of the wick are an effective pore radius of 3.1 μm, a permeability of 3.25×10⁻¹³ m² and a porosity of 71%.

Keywords: Loop heat pipe (LHP), capillary structure (wick), sintered temperature curve.

145 Textile Dyeing with Natural Dye from Sappan Tree (Caesalpinia sappan Linn.) Extract

Authors: Ploysai Ohama, Nattida Tumpat

Abstract:

Natural dye extracted from Caesalpinia sappan Linn. was applied to cotton fabric and silk yarn by a dyeing process. The dyestuff component of Caesalpinia sappan Linn. was extracted using water and ethanol. Analytical studies such as UV–VIS spectrophotometry and gravimetric analysis were performed on the extracts. Brazilein, the major dyestuff component of Caesalpinia sappan Linn., was confirmed in both the aqueous and ethanolic extracts by their UV–VIS spectra. The color of each dyed material was investigated in terms of the CIELAB (L*, a* and b*) and K/S values. Cotton fabric dyed without mordant had a reddish-brown shade, while fabric post-mordanted with aluminum potassium sulfate, ferrous sulfate and copper sulfate produced a variety of wine red to dark purple shades. Cotton fabric and silk yarn dyeing was studied using aluminum potassium sulfate as a mordant, and the observed color strength increased with increasing mordant concentration.
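
The K/S value mentioned above is conventionally obtained from reflectance via the Kubelka-Munk relation K/S = (1 - R)²/(2R); a minimal sketch with hypothetical reflectance values follows.

```python
def kubelka_munk(reflectance: float) -> float:
    """Colour strength K/S from decimal reflectance R at the wavelength of maximum absorption."""
    return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

# Hypothetical reflectance values for dyeings with increasing mordant concentration
for r in (0.40, 0.25, 0.15):
    print(f"R = {r:.2f}  ->  K/S = {kubelka_munk(r):.2f}")
```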

Keywords: Natural dyes, Plant materials, Dyeing, Mordant.

144 The Use of Seashell by-Products in Pervious Concrete Pavers

Authors: Dang Hanh Nguyen, Nassim Sebaibi, Mohamed Boutouil, Lydia Leleyter, Fabienne Baraud

Abstract:

Pervious concrete is a green alternative to conventional pavements with minimal fine aggregate and a high void content. Pervious concrete allows water to infiltrate through the pavement, thereby reducing the runoff and the requirement for stormwater management systems.

Seashell By-Products (SBP) are produced in significant quantities in France and are considered waste. This work investigated the use of SBP in pervious concrete to produce an even more environmentally friendly product, pervious concrete pavers.

The research methodology involved substituting the coarse aggregate in the pervious concrete mix design with 20%, 40% and 60% SBP. The testing showed that pervious concrete containing less than 40% SBP had strength, permeability and void content comparable to pervious concrete containing only natural aggregate. The samples that contained 40% SBP or more showed a significant loss in strength and an increase in permeability and void content relative to the control mix pervious concrete. On the basis of these results, it was found that natural aggregate can be substituted by SBP without affecting the delicate balance of a pervious concrete mix. Additionally, it is recommended that the optimum replacement percentage for SBP in pervious concrete is 40% direct replacement of the natural coarse aggregate, while maintaining the structural performance and drainage capabilities of the pervious concrete.

Keywords: Seashell by-products, pervious concrete pavers, permeability and mechanical strength.

143 Site Selection of Traffic Camera based on Dempster-Shafer and Bagging Theory

Authors: S. Rokhsari, M. Delavar, A. Sadeghi-Niaraki, A. Abed-Elmdoust, B. Moshiri

Abstract:

Traffic incidents have a negative effect on all parts of society, so controlling road networks with enough traffic devices could help decrease the number of accidents, and using the best method for optimum site selection of these devices helps implement a good monitoring system. This paper considers important criteria for the optimum site selection of traffic cameras based on aggregation methods such as the Bagging and Dempster-Shafer concepts. In the first step, important criteria, such as annual traffic flow and distance from critical places such as parks that need more traffic control, were identified for the selection of important road links for traffic camera installation. Then, classification methods such as artificial neural network and decision tree algorithms were employed to classify road links based on their importance for camera installation. Finally, to improve the classifiers' results, aggregation methods such as Bagging and Dempster-Shafer theory were used.
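
A minimal sketch of the bagging step only (the Dempster-Shafer fusion is not shown), assuming synthetic per-link criteria rather than the authors' data.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-link criteria: annual traffic flow, distance to critical places, accident count
rng = np.random.default_rng(3)
X = rng.normal(size=(400, 3))
y = (X[:, 0] - 0.7 * X[:, 1] > 0.2).astype(int)   # 1 = link selected for a camera (synthetic)

bagged_trees = BaggingClassifier(DecisionTreeClassifier(max_depth=4),
                                 n_estimators=50, random_state=0)
print("bagged accuracy:", cross_val_score(bagged_trees, X, y, cv=5).mean().round(3))
```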

Keywords: Aggregation, Bagging theory, Dempster-Shafer theory, Site selection

142 Effect of Increasing Road Light Luminance on Night Driving Performance of Older Adults

Authors: Said M. Easa, Maureen J. Reed, Frank Russo, Essam Dabbour, Atif Mehmood, Kathryn Curtis

Abstract:

The main objective of this study was to determine whether a minimal increase in road light level (luminance) could lead to improved driving performance among older adults. Older, middle-aged and younger adults were tested in a driving simulator following vision and cognitive screening. Comparisons were made for the performance of simulated night driving under two road light conditions (0.6 and 2.5 cd/m²). At each light level, the effects of self-reported night driving avoidance were examined along with vision/cognitive performance. It was found that increasing the road light level from 0.6 cd/m² to 2.5 cd/m² resulted in improved recognition of signage on straight highway segments. The improvement depends on different driver-related factors such as vision and cognitive abilities, and confidence. On curved road sections, the results showed that drivers' performance worsened. It is concluded that while increased road lighting may be helpful to older adults, especially for sign recognition, it may also result in increased driving confidence and thus reduced attention in some driving situations.

Keywords: Driving, older adults, night-time, road lighting, attention, simulation, curves, signs.

141 Molecular Characterization of Free Radicals Decomposing Genes on Plant Developmental Stages

Authors: R. Haddad, K. Morris, V. Buchanan-Wollaston

Abstract:

Biochemical and molecular analysis of some antioxidant enzyme genes revealed different levels of gene expression in oilseed rape (Brassica napus). For molecular and biochemical analysis, leaf tissues were harvested from plants at eight different developmental stages, from young to senescent. The levels of total protein and chlorophyll increased during the maturity stages of the plant, while they decreased during the last stages of plant growth. Structural analysis (nucleotide and deduced amino acid sequences, and phylogenetic tree) of a complementary DNA revealed a high level of similarity to a family of catalase genes. The expression of the genes encoding different catalase isoforms was assessed during the different growth phases. No significant difference between samples was observed when catalase activity was statistically analyzed at the different developmental stages. EST analysis exhibited different transcript levels for a number of other relevant antioxidant genes (different isoforms of SOD and glutathione). The high level of transcription of these genes at the senescence stages indicated that they are senescence-induced genes.

Keywords: Biochemical analysis, Oilseed, Expression pattern, Growth phases

140 Investigating the Thermal Characteristics of Reclaimed Solid Waste from a Landfill Site Using Thermogravimetry

Authors: S. M. Al-Salem, G.A. Leeke, H. J. Karam, R. Al-Enzi, A. T. Al-Dhafeeri, J. Wang

Abstract:

Thermogravimetry has been popularized as a thermal characterization technique since the 1950s. It aims at investigating the weight loss against both reaction time and temperature, whilst being able to characterize the evolved gases from the volatile components of the organic material being tested using an appropriate hyphenated analytical technique. In an effort to characterize and identify reclaimed waste from an unsanitary landfill site, this approach was initiated. Solid waste (SW) reclaimed from an active landfill site in the State of Kuwait was collected and prepared for characterization in accordance with international protocols. The SW was segregated and its major components were identified after washing and air drying. Shredding and cryomilling were conducted on the plastic solid waste (PSW) component to yield a material that is representative for further testing and characterization. The material was subjected to five heating rates (β) with minimal repeatable weight for high-accuracy thermogravimetric analysis (TGA), following the recommendations of the International Confederation for Thermal Analysis and Calorimetry (ICTAC). The TGA yielded thermograms that showed an offset from the typical behavior of commercial-grade resin, which was attributed to the contact of the material with soil and to thermal/photo-degradation.

Keywords: Polymer, TGA, Pollution, Landfill, Waste, Plastic.

139 Issues in the User Interface Design of a Content Rich Vocational Training Application for Digitally Illiterate Users

Authors: Jamie Otelsberg, Nagarajan Akshay, Rao R. Bhavani

Abstract:

This paper discusses our preliminary experiences in the design of a user interface for computerized, content-rich vocational training courseware meant for users with little or no computer experience. In targeting a growing population with limited access to skills training of any sort, we faced numerous challenges, including language and cultural differences, resource limits, gender boundaries and, in many cases, the simple lack of trainee motivation. With the size of the unskilled population increasing much more rapidly than the number of sufficiently skilled teachers, there is little choice but to develop teaching techniques that take advantage of emerging computer-based training technologies. However, in striving to serve populations with minimal computer literacy, one must carefully design the user interface to accommodate their cultural, social, educational, motivational and other differences. Our work, which uses computer-based and haptic simulation technologies to deliver training to these populations, has provided some useful insights on potential user interface design approaches.

Keywords: User interface design, digitally illiterate, vocational training, navigation issues, computer human interaction, human factors.

138 Near Shore Wave Manipulation for Electricity Generation

Authors: K. D. R. Jagath-Kumara, D. D. Dias

Abstract:

Sea waves carry thousands of GW of power globally. Although there are a number of different approaches to harnessing offshore energy, they are likely to be expensive, practically challenging, and vulnerable to storms. Therefore, this paper considers using near-shore waves for generating mechanical and electrical power. It introduces two new approaches, wave manipulation and the use of a variable-duct turbine, for intercepting very wide wave fronts and coping with the fluctuations of the wave height and the sea level, respectively. The first approach effectively allows capturing much more energy with a much narrower turbine rotor. The second approach allows using a rotor with a smaller radius that captures the energy of higher wave fronts at higher sea levels while preventing it from totally submerging. To illustrate the effectiveness of the first approach, the paper contains a description and the simulation results of a scale model of a wave manipulator. It then includes the results of testing a physical model of the manipulator and a single-duct, axial-flow turbine in a wave flume in the laboratory. The paper also includes comparisons of theoretical predictions, simulation results, and wave flume tests with respect to the incident energy, loss in wave manipulation, minimal loss, brake torque, and angular velocity.

Keywords: Near-shore sea waves, Renewable energy, Wave energy conversion, Wave manipulation.

137 A Multi-Layer Consistency Protocol for Replica Management in Large Scale Systems

Authors: Ghalem Belalem, Yahya Slimani

Abstract:

Large-scale systems such as computational Grids are distributed computing infrastructures that can provide globally available network resources. The evolution of information processing systems in Data Grids is characterized by a strong decentralization of data across several sites, whose objective is to ensure data availability and reliability in order to provide fault tolerance and scalability; this is only possible through the use of replication techniques. Unfortunately, the use of these techniques has a high cost, because consistency must be maintained between the distributed data. Nevertheless, agreeing to live with certain imperfections can improve the performance of the system by improving concurrency. In this paper, we propose a multi-layer protocol combining the pessimistic and optimistic approaches, conceived for data consistency maintenance in large-scale systems. Our approach is based on a hierarchical representation model with three layers and has a twofold objective: first, to reduce response times compared to a completely pessimistic approach, and second, to improve the quality of service compared to an optimistic approach.

Keywords: Data Grid, replication, consistency, optimistic approach, pessimistic approach.

136 Developing Rice Disease Analysis System on Mobile via iOS Operating System

Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit

Abstract:

This research aims to create a mobile tool to analyze rice diseases quickly and easily. The principles of object-oriented software engineering and the Objective-C language were used as the software development methodology, and the decision tree technique was used as the analysis method. Application users can select the features of a rice disease or the color appearing on the rice leaves, and the recognition analysis results are shown on the iOS mobile screen. After completing the software development, unit testing and integration testing were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was found to be at a good level (57%). The plant experts commented that various disease symptoms should be added to the database for more precise analysis results. For further research, it is suggested that an image processing system be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with greater accuracy.

Keywords: Rice disease, analysis system, mobile application, iOS operating system.

135 Using A Hybrid Algorithm to Improve the Quality of Services in Multicast Routing Problem

Authors: Mohammad Reza Karami Nejad

Abstract:

A hybrid learning automata-genetic algorithm (HLGA) is proposed to solve the QoS routing optimization problem of next generation networks. The algorithm combines the advantages of the Learning Automata algorithm (LA) and the Genetic Algorithm (GA). It first uses the good global search capability of LA to generate the initial population needed by GA; it then uses GA to improve the Quality of Service (QoS) and obtain the optimized multicast tree, an NP-complete problem, through new algorithms for the crossover and mutation operators. In the proposed algorithm, the connectivity matrix of edges is used for genotype representation. Some novel heuristics are also proposed for mutation, crossover, and the creation of random individuals. We evaluate the performance and efficiency of the proposed HLGA-based algorithm in comparison with other existing heuristic and GA-based algorithms through simulation. Simulation results demonstrate that the proposed algorithm not only has fast calculation speed and high accuracy but also improves the efficiency of QoS routing in next generation networks, outperforming the previous algorithms in the literature.
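
A minimal sketch of generic one-point crossover and bit-flip mutation over an edge-selection chromosome; the paper's own novel operators and connectivity-matrix genotype are not reproduced here.

```python
import random

def one_point_crossover(parent_a, parent_b):
    """Swap the tails of two chromosomes at a random cut point."""
    cut = random.randint(1, len(parent_a) - 1)
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

def mutate(chromosome, rate=0.05):
    """Flip each gene (an edge being in or out of the candidate tree) with small probability."""
    return [1 - g if random.random() < rate else g for g in chromosome]

# Hypothetical chromosomes marking which network edges belong to a candidate multicast tree
a = [1, 0, 1, 1, 0, 0, 1]
b = [0, 1, 1, 0, 1, 0, 0]
child_a, child_b = one_point_crossover(a, b)
print(mutate(child_a), mutate(child_b))
```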

Keywords: Routing, Quality of Service, Multicast, Learning Automata, Genetic, Next Generation Networks.

134 Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach

Authors: Parvinder S. Sandhu, Hardeep Singh

Abstract:

Automatic reusability appraisal could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components from existing legacy systems, which can save the cost of developing software from scratch. However, the issue of how to identify reusable components from existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes as well as the usability or relevancy of a component to a particular domain. Latent semantic analysis is used for the feature vector representation of the various software domains. It exploits the fact that feature vectors can be seen as documents containing terms (the identifiers present in the components), so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a neuro-fuzzy hybrid inference system, which takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the neuro-fuzzy system. The results obtained are convincing enough to propose the system for economical identification and retrieval of reusable software components.
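
A minimal sketch of the latent-semantic-analysis step, treating each component's bag of identifiers as a document; the identifiers shown are hypothetical, and this is not the authors' implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Each "document" is the bag of identifiers extracted from one software component (hypothetical)
component_identifiers = [
    "parse_matrix invert_matrix eigen_solver",
    "open_socket send_packet tcp_handshake",
    "eigen_solver lu_decompose matrix_norm",
    "close_socket recv_packet udp_stream",
]
tfidf = TfidfVectorizer(token_pattern=r"\S+").fit_transform(component_identifiers)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
print(lsa)   # components sharing identifiers land close together in the 2-D semantic space
```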

Keywords: Clustering, ID3, LSA, Neuro-fuzzy System, SVD

133 Novel Rao-Blackwellized Particle Filter for Mobile Robot SLAM Using Monocular Vision

Authors: Maohai Li, Bingrong Hong, Zesu Cai, Ronghua Luo

Abstract:

This paper presents a novel Rao-Blackwellised particle filter (RBPF) for mobile robot simultaneous localization and mapping (SLAM) using monocular vision. The particle filter is combined with an unscented Kalman filter (UKF) to extend the path posterior by sampling new poses that integrate the current observation, which drastically reduces the uncertainty about the robot pose. The landmark position estimation and update are also implemented through the UKF. Furthermore, the number of resampling steps is determined adaptively, which greatly reduces the particle depletion problem, and evolution strategies (ES) are introduced to avoid particle impoverishment. The 3D natural point landmarks are structured from matched Scale Invariant Feature Transform (SIFT) feature pairs. The matching of the multi-dimensional SIFT features is implemented with a KD-tree at a time cost of O(log₂ N). Experimental results with a real robot in our indoor environment show the advantages of our methods over previous approaches.
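
A minimal sketch of KD-tree matching of SIFT-like descriptors with SciPy, using random stand-in descriptors and a ratio test; it illustrates the data structure only, not the authors' full SLAM pipeline.

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical 128-D SIFT descriptors for mapped landmarks and the current camera frame
rng = np.random.default_rng(4)
map_desc = rng.random((500, 128))
frame_desc = rng.random((60, 128))

tree = cKDTree(map_desc)                 # built once over the landmark descriptors
dist, idx = tree.query(frame_desc, k=2)  # two nearest neighbours per query descriptor
good = dist[:, 0] < 0.8 * dist[:, 1]     # ratio test keeps only distinctive matches
print(f"distinctive matches: {int(good.sum())} of {len(frame_desc)}")
```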

Keywords: Mobile robot, simultaneous localization and mapping, Rao-Blackwellised particle filter, evolution strategies, scale invariant feature transform.

132 The Effects of Applying Linguistic Principles and Teaching Techniques in Teaching English at Secondary School in Thailand

Authors: Wannakarn Likitrattanaporn

Abstract:

The ultimate purpose of this investigation was to determine the teachers' and students' opinions towards the Adapted English Lessons. The subjects of the study were 5 Thai teachers of English and 85 Grade 10 mixed-ability students at Triamudom Suksa Pattanakarn Ratchada School, Bangkok, Thailand. The research instruments included questionnaires and an informal interview. The data from the research instruments were collected and analyzed with respect to the linguistic principles of minimal pairs and articulatory phonetics, as well as the teaching techniques of mimicry-memorization; vocabulary substitution drills; language pattern drills; reading comprehension exercises; practicing listening, speaking and writing skills; and communicative activities (informal talk and free writing). The data were statistically compiled as arithmetic percentages. The results showed that the teachers and students have highly positive opinions towards adapting linguistic principles for teaching and learning phonological accuracy. The teaching techniques provided in the Adapted English Lessons can be used efficiently in the classroom, and the teachers and students have positive opinions towards them as well.

Keywords: Applying linguistic principles and teaching techniques, teachers’ and students’ opinions, teaching English, the Adapted English Lessons.

131 Aerodynamic Design of Three-Dimensional Bellmouth for Low-Speed Open-Circuit Wind Tunnel

Authors: Harshavardhan Reddy, Balaji Subramanian

Abstract:

A systematic parametric study to find the optimum Bellmouth profile by relating geometric and performance parameters to satisfy a set of specifications is reported. A careful aerodynamic design of the Bellmouth intake is critical to properly direct the flow, with minimal losses and maximal flow uniformity, into the honeycomb located inside the settling chamber of an indraft wind tunnel, thus improving the efficiency of the entire unit. Design charts for elliptically profiled Bellmouths with two different contraction ratios (9 and 18) and three different test section speeds (25 m/s, 50 m/s, and 75 m/s) are presented. A significant performance improvement, especially in the coefficient of discharge and in the flow angularity and boundary layer thickness at the honeycomb inlet, was observed when an entry corner radius (r/D = 0.08) was added to the Bellmouth profile. The nonuniformity at the honeycomb inlet drops by about three times (~1% to 0.3%) when moving from a square to a regular octagonal cross-section. An octagonal cross-sectioned Bellmouth intake with L/d = 0.55, D/d = 1.625, and r/D = 0.08 met all four target performance specifications and is proposed as the best choice for a low-speed wind tunnel.

Keywords: Bellmouth intake, low-speed wind tunnel, coefficient of discharge, nonuniformity, flow angularity, boundary layer thickness, CFD, aerodynamics.

130 A 3D Approach for Extraction of the Coronary Artery and Quantification of the Stenosis

Authors: Mahdi Mazinani, S. D. Qanadli, Rahil Hosseini, Tim Ellis, Jamshid Dehmeshki

Abstract:

Segmentation and quantification of stenosis is an important task in assessing coronary artery disease. One of the main challenges is measuring the real diameter of curved vessels. Moreover, uncertainty in the segmentation of different tissues in narrow vessels is an important issue that affects accuracy. This paper proposes an algorithm to extract the coronary arteries and measure the degree of stenosis. A Markovian fuzzy clustering method is applied to model the uncertainty arising from the partial volume effect. The algorithm comprises segmentation, centreline extraction, estimation of the plane orthogonal to the centreline, and measurement of the degree of stenosis. To evaluate accuracy and reproducibility, the approach has been applied to a vascular phantom and the results compared with the real diameter. The results for 10 patient datasets have been visually judged by a qualified radiologist. The results reveal the superiority of the proposed method compared to the conventional thresholding method (CTM) on both datasets.

Keywords: 3D coronary artery tree extraction, segmentation, quantification, fuzzy clustering, and Markov random field

129 CPT Pore Water Pressure Correlations with PDA to Identify Pile Drivability Problem

Authors: Fauzi Jarushi, Paul Cosentino, Edward Kalajian, Hadeel Dekhn

Abstract:

At certain depths during large-diameter displacement pile driving, rebound well over 0.25 inches was experienced, followed by a small permanent set during each hammer blow. High pile rebound (HPR) soils may stop pile driving and result in limited pile capacity. In some cases, rebound leads to pile damage, delaying the construction project and requiring foundation redesign. HPR was evaluated at seven Florida sites during the driving of square precast, prestressed concrete piles into saturated, fine silty to clayey sands and sandy clays. Pile Driving Analyzer (PDA) deflection-versus-time data recorded during installation were used to develop correlations between cone penetrometer (CPT) pore-water pressures, pile displacements and rebound. At five sites where piles experienced excessive HPR with minimal set, the pore pressure yielded very high positive values of greater than 20 tsf. However, at the site where the pile rebounded but achieved an acceptable permanent set, the measured pore pressure ranged between 5 and 20 tsf. The pore pressure exhibited values of less than 5 tsf at the site where no rebound was noticed. In summary, direct correlations between CPTu pore pressure and rebound were produced, allowing identification of soils that produce HPR.

Keywords: CPTu, pore water pressure, pile rebound.

128 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance

Authors: Rajinder Singh, Ram Valluru

Abstract:

The Chain Ladder (CL) method, the Expected Loss Ratio (ELR) method and the Bornhuetter-Ferguson (BF) method, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium- to longer-term liabilities. The relative strengths and weaknesses among the various alternative approaches revolve around stability in the recent loss development pattern, sufficiency and reliability of loss development data, and agreement or disagreement between reported losses to date and the ultimate loss estimate. The CL method results in volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good tradeoff between the loss development approach (CL) and ELR, it generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years that have more development experience. Further, BF is based on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, like the economy, which impact various cohorts at the same chronological time but at staggered points along their lifetime development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period). The methodology was tested on an actual MI claim development dataset where various cohorts followed a sigmoidal trend but levels varied substantially depending upon the economic and operational conditions during the development period spanning many years. The proposed approach provides the ability to indirectly incorporate such exogenous factors and produce more stable loss forecasts for reserving purposes compared to the traditional CL and BF methods.
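
A minimal sketch of fitting a sigmoidal (logistic) development curve to ever-to-date losses for one cohort, with hypothetical numbers; the authors' exact parametrization and regression setup may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def development_curve(t, ultimate, k, t0):
    """Sigmoidal cumulative loss development toward an ultimate level."""
    return ultimate / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical ever-to-date cumulative losses for one cohort, by development quarter
quarters = np.arange(1, 13)
etd = np.array([2, 5, 11, 22, 38, 55, 69, 80, 87, 91, 94, 95.5])

(ultimate, k, t0), _ = curve_fit(development_curve, quarters, etd, p0=[100.0, 0.5, 6.0])
print(f"fitted ultimate = {ultimate:.1f}, indicated reserve = {ultimate - etd[-1]:.1f}")
```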

Keywords: Actuarial loss reserving techniques, logistic regression, parametric function, volatility.

127 An Enhanced Distributed System to Improve the Time Complexity of Binary Indexed Trees

Authors: Ahmed M. Elhabashy, A. Baes Mohamed, Abou El Nasr Mohamad

Abstract:

Distributed computing systems are usually considered the most suitable model for practical solutions of many parallel algorithms. In this paper, an enhanced distributed system is presented to improve the time complexity of Binary Indexed Trees (BIT). The proposed system uses multiple uniform processors with identical architectures and a specially designed distributed memory system. The analysis of this system has shown that it reduces the time complexity of the read query to O(log(log N)) and the update query to constant complexity, while the naive solution has a time complexity of O(log N) for both queries. The system was implemented and simulated using the VHDL and Verilog hardware description languages, with Xilinx ISE 10.1 as the development environment and ModelSim 6.1c as the simulation tool. The simulation has shown that the overhead resulting from the wiring and communication between the system fragments can be fairly neglected, which makes it possible to practically reach the maximum speed-up offered by the proposed model.
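
For reference, a minimal sketch of the sequential baseline being improved upon: a classic Binary Indexed Tree with O(log N) point update and prefix-sum query.

```python
class BinaryIndexedTree:
    """Classic sequential Fenwick/BIT: O(log N) point update and prefix-sum query."""
    def __init__(self, n: int):
        self.tree = [0] * (n + 1)   # 1-indexed internal array

    def update(self, i: int, delta: int) -> None:
        while i < len(self.tree):
            self.tree[i] += delta
            i += i & -i             # jump to the next node covering index i

    def prefix_sum(self, i: int) -> int:
        total = 0
        while i > 0:
            total += self.tree[i]
            i -= i & -i             # drop the lowest set bit
        return total

bit = BinaryIndexedTree(16)
bit.update(3, 5)
bit.update(7, 2)
print(bit.prefix_sum(10))   # prints 7
```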

Keywords: Binary Indexed Tree (BIT), Least Significant Bit (LSB), Parallel Adder (PA), Very High Speed Integrated Circuits Hardware Description Language (VHDL), Distributed Parallel Computing System (DPCS).

126 Supervisor Controller-Based Colored Petri Nets for Deadlock Control and Machine Failures in Automated Manufacturing Systems

Authors: Husam Kaid, Abdulrahman Al-Ahmari, Zhiwu Li

Abstract:

This paper develops a robust deadlock control technique for shared and unreliable resources in automated manufacturing systems (AMSs) based on structural analysis and colored Petri nets, which consists of three steps. The first step involves using strict minimal siphon control to create a live (deadlock-free) system that does not consider resource failure. The second step uses an approach based on colored Petri nets, in which all monitors designed in the first step are merged into a single monitor. The third step addresses the deadlock control problems caused by resource failures. For all resource failures in the Petri net model, a common recovery subnet based on colored Petri nets is proposed. The common recovery subnet is added to the system obtained in the second step to make the system reliable. The proposed approach is evaluated using an AMS from the literature. The results show that the proposed approach can be applied to an unreliable complex Petri net model, has a simpler structure and less computational complexity, and can obtain one common recovery subnet to model all resource failures.

Keywords: Automated manufacturing system, colored Petri net, deadlock, siphon.

125 Probe Selection for Pathway-Specific Microarray Probe Design Minimizing Melting Temperature Variance

Authors: Fabian Horn, Reinhard Guthke

Abstract:

In molecular biology, microarray technology is widely and successfully utilized to efficiently measure gene activity. When working with less-studied organisms, methods to design custom-made microarray probes are available. One design criterion is to select probes with minimal melting temperature variance, thus ensuring similar hybridization properties. If the microarray application focuses on the investigation of metabolic pathways, it is not necessary to cover the whole genome; it is more efficient to cover each metabolic pathway with a limited number of genes. Firstly, an approach is presented which minimizes the overall melting temperature variance of the selected probes for all genes of interest. Secondly, the approach is extended to include the additional constraint of covering all pathways with a limited number of genes while minimizing the overall variance. The new optimization problem is solved by a bottom-up programming approach, which reduces the complexity to make it computationally feasible. As an example, the new method is applied to the selection of microarray probes covering all fungal secondary metabolite gene clusters of Aspergillus terreus.
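
A simplified heuristic sketch of picking one probe per gene so that the chosen melting temperatures cluster tightly; it ignores the pathway-coverage constraint and is not the authors' bottom-up algorithm. The candidate Tm values are hypothetical.

```python
import numpy as np

def select_probes(candidate_tms, grid=np.linspace(55.0, 75.0, 201)):
    """Scan a grid of common target Tms; for each target pick, per gene, the candidate
    probe closest to it, and keep the selection with the smallest Tm variance."""
    best, best_var = None, np.inf
    for target in grid:
        choice = [min(tms, key=lambda tm: abs(tm - target)) for tms in candidate_tms]
        var = float(np.var(choice))
        if var < best_var:
            best, best_var = choice, var
    return best, best_var

# Hypothetical candidate-probe melting temperatures (°C) for three genes of interest
genes = [[61.2, 64.8, 69.1], [58.7, 63.9], [65.5, 66.2, 71.0]]
print(select_probes(genes))
```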

Keywords: bottom-up approach, gene clusters, melting temperature, metabolic pathway, microarray probe design, probe selection

124 Formal Analysis of a Public-Key Algorithm

Authors: Markus Kaiser, Johannes Buchmann

Abstract:

In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a (computer-proven) formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as efficient computer proofs of security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we get reliable proofs with a minimal error rate, augmenting the used database, which provides a formal basis for more computer proof constructions in this area.
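
For context, a toy sketch of the Rabin operations being formalized (not the Isabelle/HOL development itself): encryption squares the message modulo n = p·q, and with p ≡ q ≡ 3 (mod 4) decryption recovers the four square roots via modular exponentiation and the CRT. The primes used here are illustratively small.

```python
def rabin_roots(c, p, q):
    """Return the four square roots of c modulo n = p*q for p ≡ q ≡ 3 (mod 4);
    one of them is the original plaintext."""
    n = p * q
    mp = pow(c, (p + 1) // 4, p)       # square root of c modulo p
    mq = pow(c, (q + 1) // 4, q)       # square root of c modulo q
    yp, yq = pow(p, -1, q), pow(q, -1, p)
    r = (yq * q * mp + yp * p * mq) % n
    s = (yq * q * mp - yp * p * mq) % n
    return {r, n - r, s, n - s}

p, q = 23, 31          # toy primes with p ≡ q ≡ 3 (mod 4); real keys use large primes
n = p * q              # public key
m = 123                # plaintext, m < n
c = pow(m, 2, n)       # Rabin encryption is just squaring modulo n
print(sorted(rabin_roots(c, p, q)), "contains", m)
```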

Keywords: Public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.
