Search results for: Local weighted kernel
1721 2D Rigid Registration of MR Scans using the 1D Binary Projections
Authors: Panos D. Kotsas
Abstract:
This paper presents the application of a signal-intensity-independent registration criterion for 2D rigid-body registration of medical images using 1D binary projections. The criterion is defined as the weighted ratio of the two projections. The ratio is computed on a pixel-by-pixel basis, and weighting is performed by setting the ratios between one and zero pixels to a fixed high value. The mean squared value of the weighted ratio is computed over the union of the one-areas of the two projections and is minimized using a Chebyshev polynomial approximation with n=5 points. The sum of the x and y projections is used for translational adjustment and a 45° projection for rotational adjustment. Twenty T1-T2 registration experiments were performed and gave mean errors of 1.19° and 1.78 pixels. The method is suitable for contour/surface matching. Further research is necessary to determine the robustness of the method with regard to threshold, shape, and missing data.
Keywords: Medical image, projections, registration, rigid.
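The criterion lends itself to a compact illustration. Below is a minimal Python sketch of the weighted-ratio cost evaluated on two 1D binary projections; the high-ratio constant and the toy projections are illustrative assumptions, not the paper's values:

```python
import numpy as np

def weighted_ratio_cost(p_ref, p_mov, high_weight=10.0):
    """Mean squared weighted ratio of two 1D binary projections over the
    union of their one-areas. high_weight is the penalty assigned where one
    projection is 1 and the other 0 (an illustrative choice)."""
    p_ref, p_mov = p_ref.astype(bool), p_mov.astype(bool)
    union = p_ref | p_mov
    # Ratio is 1 where the one-areas agree, a high value where they disagree.
    ratio = np.where(p_ref == p_mov, 1.0, high_weight)
    return np.mean(ratio[union] ** 2)

# Toy usage: two shifted binary x-projections of the same object.
ref = np.zeros(64, dtype=int); ref[20:40] = 1
mov = np.zeros(64, dtype=int); mov[24:44] = 1
print(weighted_ratio_cost(ref, ref), weighted_ratio_cost(ref, mov))
```

Minimizing this cost over translations and rotations is then a 1D optimization per projection, which is what makes the projection-based formulation cheap.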

1720 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection of apples, quality analysis and grading of citrus fruits, bruise detection of strawberries, visualization of sugar distribution in melons, measuring the ripening of tomatoes, defect detection of pickling cucumbers, and classification of wheat kernels. HSI can be used to collect large amounts of spatial and spectral data on the observed objects concurrently. This technique yields exceptional detection capabilities that cannot otherwise be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving two-pattern (two-class) problems. The conventional FKT method has been improved with kernel machines to increase its nonlinear discrimination ability and capture higher-order statistics of the data. The proposed approach aims to segment the fat content of the ground meat by regarding the fat as the target class to be separated from the remaining classes (the clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.
Keywords: Food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods.
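The kernelized transform is beyond a short sketch, but the underlying linear Fukunaga-Koontz transform can be illustrated briefly. The following Python sketch builds an FKT basis for a target/clutter split on synthetic "spectra"; the data, dimensionality, and component count are assumptions, and the kernel extension used in the paper is omitted:

```python
import numpy as np

def fkt_basis(X_target, X_clutter, n_components=3):
    """Minimal linear Fukunaga-Koontz transform. Returns a projection whose
    leading columns respond strongly to the target class and weakly to the
    clutter class (the kernel version replaces the raw features with
    kernel-induced ones)."""
    R1 = X_target.T @ X_target / len(X_target)      # target autocorrelation
    R2 = X_clutter.T @ X_clutter / len(X_clutter)   # clutter autocorrelation
    evals, evecs = np.linalg.eigh(R1 + R2)          # whiten the summed matrix
    keep = evals > 1e-10
    P = evecs[:, keep] / np.sqrt(evals[keep])
    # In the whitened space the two matrices share eigenvectors and their
    # eigenvalues sum to one: large eigenvalues mark target-dominant directions.
    d, V = np.linalg.eigh(P.T @ R1 @ P)
    order = np.argsort(d)[::-1]
    return P @ V[:, order[:n_components]]

rng = np.random.default_rng(0)
fat = rng.normal(1.0, 0.2, (200, 10))    # stand-in "target" spectra
lean = rng.normal(0.4, 0.2, (200, 10))   # stand-in "clutter" spectra
W = fkt_basis(fat, lean)
print(np.mean((fat @ W) ** 2), np.mean((lean @ W) ** 2))
```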

1719 A Stochastic Analytic Hierarchy Process Based Weighting Model for Sustainability Measurement in an Organization
Authors: Faramarz Khosravi, Gokhan Izbirak
Abstract:
A weighted, statistically stochastic Analytic Hierarchy Process (AHP) model for modeling the potential barriers and enablers of sustainability, and for measuring and assessing the sustainability level, is proposed. For context-dependent potential barriers and enablers, the proposed model builds on the properties of the variables describing the sustainability functions and is developed into a realistic analytical model of the sustainable behavior of an organization, thus serving as a means of measuring the organization's sustainability. The main focus of this paper is the application of the AHP tool in a statistically based model for measuring sustainability; a strongly weighted stochastic AHP-based procedure is thereby obtained. A case study of a widely reported major Canadian electric utility was adopted to demonstrate the applicability of the developed model, and its results were compared with those of an equal-weighted model. Fluctuations in the sustainability of the company over time were identified. In the results obtained, the sustainability index for successive years changed from 73.12%, 79.02%, 74.31%, 76.65%, 80.49%, 79.81%, and 79.83% to the more exact values 73.32%, 77.72%, 76.76%, 79.41%, 81.93%, 79.72%, and 80.45%, according to the factor priorities obtained from expert views. By obtaining the necessary informative measurement indicators, the model can practically and effectively evaluate the sustainability of any organization and determine its fluctuations over time.
Keywords: AHP, sustainability fluctuation, environmental indicators, performance measurement, environmental sustainability.
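The deterministic AHP weighting step at the core of such a model can be sketched briefly. The following Python snippet derives priority weights from a reciprocal pairwise-comparison matrix via the principal eigenvector and reports Saaty's consistency ratio; the example matrix is invented and the stochastic extension described in the abstract is not reproduced:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priorities from a reciprocal pairwise-comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio (random-index
    values given here only for n <= 5)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    evals, evecs = np.linalg.eig(A)
    k = np.argmax(evals.real)
    w = np.abs(evecs[:, k].real)
    w /= w.sum()
    ci = (evals[k].real - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    return w, (ci / ri if ri else 0.0)

# Example: three sustainability indicators compared on Saaty's 1-9 scale.
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))
```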

1718 MEGSOR Iterative Scheme for the Solution of 2D Elliptic PDE's
Authors: J. Sulaiman, M. Othman, M. K. Hasan
Abstract:
Recent findings on the MEG iterative scheme have demonstrated that it accelerates the convergence rate in solving systems of linear equations generated from approximation equations of boundary value problems. Based on the same scheme, the aim of this paper is to investigate the capability of a family of four-point block iterative methods with a weighted parameter ω, namely the 4 Point-EGSOR, 4 Point-EDGSOR, and 4 Point-MEGSOR, in solving two-dimensional elliptic partial differential equations using the second-order finite difference approximation. The formulation and implementation of the three four-point block iterative methods are also presented. Finally, the experimental results show that the 4 Point-MEGSOR iterative scheme is superior to the existing four-point block schemes.
Keywords: MEG iteration, second-order finite difference, weighted parameter.
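The four-point block formulation is not reproduced here, but the role of the weighted parameter ω can be illustrated with plain point-SOR on a model problem. A minimal Python sketch for the 2D Poisson equation with second-order central differences (grid size, ω, and tolerance are illustrative):

```python
import numpy as np

def sor_poisson(f, omega=1.8, tol=1e-8, max_iter=10_000):
    """Point-SOR for -(u_xx + u_yy) = f on the unit square with zero
    Dirichlet boundaries and second-order central differences. This only
    illustrates the relaxation parameter omega; the block (MEGSOR-type)
    variants update groups of four points per sweep instead."""
    n = f.shape[0]
    h2 = (1.0 / (n + 1)) ** 2
    u = np.zeros((n + 2, n + 2))
    for it in range(max_iter):
        diff = 0.0
        for i in range(1, n + 1):
            for j in range(1, n + 1):
                gs = 0.25 * (u[i-1, j] + u[i+1, j] + u[i, j-1] + u[i, j+1]
                             + h2 * f[i-1, j-1])
                new = (1 - omega) * u[i, j] + omega * gs
                diff = max(diff, abs(new - u[i, j]))
                u[i, j] = new
        if diff < tol:
            return u, it + 1
    return u, max_iter

_, sweeps = sor_poisson(np.ones((20, 20)), omega=1.8)
print("SOR sweeps to converge:", sweeps)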

1717 An EWMA p Chart Based On Improved Square Root Transformation
Authors: S. Sukparungsee
Abstract:
The traditional Shewhart p chart was developed for charting binomial data. It relies on the normal approximation, under the conditions of a low defect level and a small to moderate sample size. Real applications, however, often depart from these assumptions because of skewness in the exact distribution. In this paper, a modified Exponentially Weighted Moving Average (EWMA) control chart for detecting a change in binomial data is proposed by improving square root transformations, namely the ISRT p EWMA control chart. The numerical results show that the ISRT p EWMA chart is superior to the ISRT p chart for small to moderate shifts, whereas the latter is better for large shifts.
Keywords: Number of defects, Exponentially Weighted Moving Average, Average Run Length, Square root transformations.
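A generic EWMA chart on variance-stabilised proportions illustrates the kind of statistic being designed. In the Python sketch below, Anscombe's arcsine square-root rule stands in for the paper's improved square root transformation (ISRT), and λ and L are typical rather than optimised values:

```python
import numpy as np

def ewma_chart(counts, n, lam=0.1, L=3.0):
    """EWMA monitoring of binomial counts after a variance-stabilising
    transform (Anscombe's arcsine square-root rule used here only as a
    placeholder for the ISRT). Returns the EWMA statistic, control limits,
    and out-of-control flags."""
    y = np.arcsin(np.sqrt((counts + 3/8) / (n + 3/4)))  # ~ N(asin(sqrt(p)), 1/(4n))
    mu0, sigma = y[:20].mean(), np.sqrt(1.0 / (4 * n))  # baseline from first 20 samples
    z, z_prev = np.empty_like(y), None
    z_prev = mu0
    for t, yt in enumerate(y):
        z_prev = lam * yt + (1 - lam) * z_prev
        z[t] = z_prev
    i = np.arange(1, len(y) + 1)
    half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
    return z, mu0 - half, mu0 + half, np.abs(z - mu0) > half

rng = np.random.default_rng(1)
x = np.concatenate([rng.binomial(50, 0.05, 40), rng.binomial(50, 0.12, 10)])
z, lcl, ucl, signal = ewma_chart(x, n=50)
print("first out-of-control sample:", np.argmax(signal) if signal.any() else None)
```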

1716 A Kernel Based Rejection Method for Supervised Classification
Authors: Abdenour Bounsiar, Edith Grall, Pierre Beauseroy
Abstract:
In this paper we are interested in classification problems with a performance constraint on the error probability. In such problems, if the constraint cannot be satisfied, a rejection option is introduced. For binary-labelled classification, a number of SVM-based methods with a rejection option have been proposed over the past few years. All of these methods use two thresholds on the SVM output. However, in previous work, we have shown on synthetic data that using thresholds on the output of the optimal SVM may lead to poor results for classification tasks with a performance constraint. In this paper a new method for supervised classification with a rejection option is proposed. It consists of two classifiers jointly optimized to minimize the rejection probability subject to a given constraint on the error rate. This method uses a new kernel-based linear learning machine that we have recently presented. This learning machine is characterized by its simplicity and high training speed, which makes the simultaneous optimization of the two classifiers computationally reasonable. The proposed classification method with rejection option is compared to an SVM-based rejection method from the recent literature. Experiments show the superiority of the proposed method.
Keywords: Rejection, Chow's rule, error-reject tradeoff, Support Vector Machine.
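The two-threshold baseline that the method is compared against can be sketched directly. The following Python snippet applies Chow-style rejection to an SVM decision value on synthetic data; the thresholds are illustrative, and the jointly optimised classifier pair proposed in the paper is not reproduced:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Two-threshold rejection on an SVM score: reject when the decision value
# falls inside [t_low, t_high]. Thresholds would normally be tuned on a
# validation split to meet the error-rate constraint; the values below are
# assumptions for illustration.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (300, 2)), rng.normal(1, 1, (300, 2))])
y = np.r_[np.zeros(300), np.ones(300)]
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)

clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
scores = clf.decision_function(X_va)

t_low, t_high = -0.5, 0.5
accepted = (scores <= t_low) | (scores >= t_high)
pred = (scores >= t_high).astype(int)
err = np.mean(pred[accepted] != y_va[accepted])
print(f"reject rate {1 - accepted.mean():.2f}, error on accepted {err:.3f}")
```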

1715 An Augmented Automatic Choosing Control with Constrained Input Using Weighted Gradient Optimization Automatic Choosing Functions
Authors: Toshinori Nawata
Abstract:
In this paper we consider a nonlinear feedback control called augmented automatic choosing control (AACC) for nonlinear systems with constrained input, using weighted gradient optimization automatic choosing functions. The constant term that arises from linearization of a given nonlinear system is treated as a coefficient of stable zero dynamics. The parameters of the control are suboptimally selected by maximizing the stable region in the sense of Lyapunov, with the aid of a genetic algorithm. This approach is applied to a field excitation control problem of a power system to demonstrate the effectiveness of the AACC. Simulation results show that the new controller can improve performance remarkably well.
Keywords: Augmented automatic choosing control, nonlinear control, genetic algorithm, zero dynamics.

1714 Object Detection based Weighted-Center Surround Difference
Authors: Seung-Hun Kim, Kye-Hoon Jeon, Byoung-Doo Kang, Il-Kyun Jung
Abstract:
Intelligent traffic surveillance is an active issue in the field of traffic data analysis, requiring technology that detects moving objects in real time under variations in background and natural light. In this paper, we propose a weighted center-surround difference method for object detection in outdoor environments. The proposed system detects objects using a saliency map obtained by analyzing the weight of each layer of a Gaussian pyramid. In order to validate the effectiveness of our system, we implemented the proposed method on a digital signal processor, the TMS320DM6437. Experimental results show that blurred noise around objects is effectively eliminated and the object detection accuracy is improved.
Keywords: Saliency Map, Center Surround Difference, Object Detection, Surveillance System.

1713 Hybrid Artificial Bee Colony and Least Squares Method for Rule-Based Systems Learning
Authors: Ahcene Habbi, Yassine Boudouaoui
Abstract:
This paper deals with the problem of automatic rule generation for fuzzy system design. The proposed approach is based on hybrid artificial bee colony (ABC) optimization and the weighted least squares (LS) method, and aims to find the structure and parameters of fuzzy systems simultaneously. More precisely, two ABC-based fuzzy modeling strategies are presented and compared. The first strategy uses global optimization to learn fuzzy models; the second hybridizes ABC and the weighted least squares estimation method. The performances of the proposed ABC and ABC-LS fuzzy modeling strategies are evaluated on complex modeling problems and compared to other advanced modeling methods.
Keywords: Automatic design, learning, fuzzy rules, hybrid, swarm optimization.
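The weighted least squares step used for the consequent parameters has a simple closed form. A minimal Python sketch (the design matrix and weights are synthetic stand-ins for rule firing strengths; the ABC search itself is not shown):

```python
import numpy as np

def weighted_least_squares(X, y, w):
    """Closed-form weighted LS estimate theta = (X^T W X)^{-1} X^T W y, the
    kind of estimator used for consequent parameters once the rule firing
    strengths (the weights w) are fixed."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.uniform(-1, 1, 100)])
y = 2.0 + 3.0 * X[:, 1] + rng.normal(0, 0.1, 100)
w = rng.uniform(0.1, 1.0, 100)   # e.g. normalised rule firing strengths
print(weighted_least_squares(X, y, w))
```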

1712 Weighted-Distance Sliding Windows and Cooccurrence Graphs for Supporting Entity-Relationship Discovery in Unstructured Text
Authors: Paolo Fantozzi, Luigi Laura, Umberto Nanni
Abstract:
The problem of entity relation discovery in unstructured data, a well-covered topic in the literature, consists of searching within unstructured sources (typically, text) in order to find connections among entities. These can be a whole dictionary or a specific collection of named items. In many cases, machine learning and/or text mining techniques are used for this goal. These approaches may be unfeasible for computationally challenging problems, such as processing massive data streams. A faster approach consists of collecting the cooccurrences of any two words (entities) in order to create a graph of relations - a cooccurrence graph. Indeed, each cooccurrence indicates some degree of semantic correlation between the words, because related words are more commonly found close to each other than at opposite ends of the text. Some authors have used sliding windows for this problem: they count all the cooccurrences within a sliding window running over the whole text. In this paper we generalise this technique, arriving at a Weighted-Distance Sliding Window, where each occurrence of two named items within the window is accounted with a weight depending on the distance between the items: a closer distance implies stronger evidence of a relationship. We develop an experiment to support this intuition by applying the technique to a data set consisting of the text of the Bible, split into verses.
Keywords: Cooccurrence graph, entity relation graph, unstructured text, weighted distance.
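The weighted-distance window is easy to state in code. The sketch below uses a 1/distance decay, which is one possible weighting rather than necessarily the one used in the paper:

```python
from collections import defaultdict

def weighted_cooccurrence(tokens, window=5):
    """Build a cooccurrence graph in which each pair of items appearing
    within the sliding window adds a weight that decays with their distance
    (1/distance here; the decay function is a modelling choice)."""
    graph = defaultdict(float)
    for i, a in enumerate(tokens):
        for j in range(i + 1, min(i + window, len(tokens))):
            b = tokens[j]
            if a == b:
                continue
            edge = tuple(sorted((a, b)))
            graph[edge] += 1.0 / (j - i)
    return graph

verse = "in the beginning god created the heaven and the earth".split()
top = sorted(weighted_cooccurrence(verse).items(), key=lambda kv: -kv[1])[:5]
for edge, w in top:
    print(edge, round(w, 2))
```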

1711 Breast Cancer Survivability Prediction via Classifier Ensemble
Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia
Abstract:
This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups. It then tries to find the most important features among the four groups that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER through the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period of 1973-2002). It gives an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
Keywords: Classifier ensemble, breast cancer survivability, data mining, SEER.
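The stacked-ensemble idea can be sketched with scikit-learn. In the snippet below, synthetic data stands in for SEER, a logistic regression replaces the Bayesian-network base learner (which scikit-learn does not provide), and the feature-group selection step is omitted:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Several base classifiers, each of which could be trained on its own
# feature group, combined by a Naive Bayes meta-learner.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.7, 0.3],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

ensemble = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=6, random_state=0)),
                ("nb", GaussianNB()),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=GaussianNB(),
    stack_method="predict_proba",
)
ensemble.fit(X_tr, y_tr)
pred = ensemble.predict(X_te)
print("weighted F1:", round(f1_score(y_te, pred, average="weighted"), 3))
```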

1710 Analysis of Risk-Based Disaster Planning in Local Communities
Authors: R. A. Temah, L. A. Nkengla-Asi
Abstract:
Planning for future disasters sets the stage for a variety of activities that may trigger multiple recurring operations and expose the community to opportunities to minimize risks. Local communities are increasingly embracing the necessity of planning based on local risks, but they are also significantly challenged to plan for and respond to disasters effectively. This research examines a basic risk-based disaster planning model and compares it with advanced risk-based planning, which introduces the identification and alignment of a variety of local capabilities, within and outside the local community, that can be pivotal in managing local risks and cascading effects prior to a disaster. A critical review shows that the identification and alignment of capabilities can potentially enhance risk-based disaster planning. A tailored, holistic approach to risk-based disaster planning is pivotal to enhancing collective action and reducing the collective cost of disasters.
Keywords: Capabilities, disaster planning, hazards, local community, risk-based.

1709 Decision Making with Dempster-Shafer Theory of Evidence Using Geometric Operators
Authors: José M. Merigó, Montserrat Casanovas
Abstract:
We study the problem of decision making with a Dempster-Shafer belief structure. We analyze the previous work by Yager on using the ordered weighted averaging (OWA) operator in the aggregation of the Dempster-Shafer decision process. We discuss the possibility of aggregating with an ascending order in the OWA operator for cases where the smallest value is the best result. We suggest the introduction of the ordered weighted geometric (OWG) operator in the Dempster-Shafer framework. In this case, we also discuss the possibility of aggregating with an ascending order, and we find that it is absolutely necessary, as the OWG operator cannot aggregate negative numbers. Finally, we give an illustrative example in which we can see the different results obtained by using the OWA, the Ascending OWA (AOWA), the OWG, and the Ascending OWG (AOWG) operators.
Keywords: Decision making, aggregation operators, Dempster-Shafer theory of evidence, uncertainty, OWA operator, OWG operator.
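The four aggregation operators have compact definitions. A minimal Python sketch follows (the payoff values and weighting vector are illustrative, and the surrounding Dempster-Shafer belief-structure aggregation is omitted):

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: weights are applied to the values after
    sorting them in descending order."""
    return np.dot(np.sort(values)[::-1], weights)

def owg(values, weights):
    """Ordered weighted geometric operator: weighted geometric mean of the
    descending-ordered values (all values must be positive)."""
    return np.prod(np.sort(values)[::-1] ** np.asarray(weights))

def aowa(values, weights):
    """Ascending OWA, for cases where the smallest value is the best result."""
    return np.dot(np.sort(values), weights)

def aowg(values, weights):
    """Ascending OWG."""
    return np.prod(np.sort(values) ** np.asarray(weights))

payoffs = np.array([70.0, 40.0, 90.0])   # e.g. payoffs under three states
w = np.array([0.3, 0.4, 0.3])            # weighting vector summing to one
print(owa(payoffs, w), owg(payoffs, w), aowa(payoffs, w), aowg(payoffs, w))
```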

1708 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation
Authors: Oğuzhan Urhan
Abstract:
In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for block-based ME in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames together with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) approach improves on conventional C-1BT based ME by employing a 2-bit-depth constraint mask instead of a 1-bit-depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared to similar existing ME methods in the literature.
Keywords: Fast motion estimation, low-complexity motion estimation, video coding.
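The binary transform and XOR matching criterion behind such methods can be sketched briefly. In the Python snippet below a uniform (mean) filter stands in for the multi-band-pass kernel of the 1BT literature, and the C-1BT/WC-1BT constraint masks are not reproduced:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(frame, size=17):
    """Basic one-bit transform: compare each pixel with a smoothed version
    of the frame (mean filter used here as a stand-in kernel)."""
    return frame >= uniform_filter(frame.astype(float), size=size)

def nnmp(block_cur, block_ref):
    """Number of non-matching points: the Boolean XOR matching criterion
    that replaces SAD in binary motion estimation."""
    return int(np.count_nonzero(block_cur ^ block_ref))

rng = np.random.default_rng(0)
cur = rng.integers(0, 256, (64, 64))
ref = np.roll(cur, shift=(2, 3), axis=(0, 1))       # reference = shifted current
b_cur, b_ref = one_bit_transform(cur), one_bit_transform(ref)
block = b_cur[16:32, 16:32]
costs = {(dy, dx): nnmp(block, b_ref[16+dy:32+dy, 16+dx:32+dx])
         for dy in range(-4, 5) for dx in range(-4, 5)}
print("best displacement:", min(costs, key=costs.get))
```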

1707 A Developmental Survey of Local Stereo Matching Algorithms
Authors: André Smith, Amr Abdel-Dayem
Abstract:
This paper presents an overview of the history and development of stereo matching algorithms. Details from their inception up to relatively recent techniques are described, noting challenges that have been surmounted across the past decades. Different components of these algorithms are explored, though the focus is directed towards local matching techniques. While global approaches have existed for some time and have demonstrated greater accuracy than their counterparts, they are generally quite slow. Many strides have been made more recently, allowing local methods to catch up in terms of accuracy without sacrificing overall performance.
Keywords: Developmental survey, local stereo matching, stereo correspondence.

1706 Parameter Estimation for Viewing Rank Distribution of Video-on-Demand
Authors: Hyoup-Sang Yoon
Abstract:
Video-on-demand (VOD) is designed by using content delivery networks (CDN) to minimize the overall operational cost and to maximize scalability. Estimation of the viewing pattern (i.e., the relationship between the number of viewings and the ranking of VOD contents) plays an important role in minimizing the total operational cost and maximizing the performance of the VOD systems. In this paper, we have analyzed a large body of commercial VOD viewing data and found that the viewing rank distribution fits well with the parabolic fractal distribution. The weighted linear model fitting function is used to estimate the parameters (coefficients) of the parabolic fractal distribution. This paper presents an analytical basis for designing an optimal hierarchical VOD contents distribution system in terms of its cost and performance.
Keywords: VOD, CDN, parabolic fractal distribution, viewing rank, weighted linear model fitting
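The parabolic fractal law and its weighted fit can be illustrated in a few lines. The Python sketch below fits log-views as a quadratic in log-rank on synthetic counts; the weighting scheme is illustrative, not necessarily the one used in the paper:

```python
import numpy as np

def fit_parabolic_fractal(views, weights=None):
    """Fit log10(views) = a + b*log10(rank) + c*log10(rank)^2, the parabolic
    fractal law, by (optionally weighted) least squares via numpy.polyfit."""
    rank = np.arange(1, len(views) + 1)
    x, y = np.log10(rank), np.log10(views)
    c, b, a = np.polyfit(x, y, deg=2, w=weights)
    return a, b, c

# Synthetic viewing counts that roughly follow the parabolic fractal shape.
rank = np.arange(1, 201)
true = 10 ** (5.0 - 0.6 * np.log10(rank) - 0.15 * np.log10(rank) ** 2)
views = true * np.random.default_rng(0).lognormal(0, 0.05, rank.size)
print(np.round(fit_parabolic_fractal(views, weights=np.sqrt(views)), 3))
```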

1705 Critical Buckling Load of Carbon Nanotube with Non-Local Timoshenko Beam Using the Differential Transform Method
Authors: Tayeb Bensattalah, Mohamed Zidour, Mohamed Ait Amar Meziane, Tahar Hassaine Daouadji, Abdelouahed Tounsi
Abstract:
In this paper, the Differential Transform Method (DTM) is employed to predict and analyze the non-local critical buckling loads of carbon nanotubes with various end conditions, using the non-local Timoshenko beam described by a single differential equation. The buckling differential equation of the nanobeams is derived via a non-local theory, and the solution for the non-local critical buckling loads is found by the DTM. The DTM is introduced briefly; it can easily be applied to linear or nonlinear problems and reduces the amount of computational work. The influence of the boundary conditions, the chirality of the carbon nanotube, and the aspect ratio on the non-local critical buckling loads is studied and discussed. The effects of the nonlocal parameter, the ratio L/d, the chirality of the single-walled carbon nanotube, and the boundary conditions on the buckling of the CNT are investigated.
Keywords: Boundary conditions, buckling, non-local, the differential transform method.

1704 Cross Layer Optimization for Fairness Balancing Based on Adaptively Weighted Utility Functions in OFDMA Systems
Authors: Jianwei Wang, Timo Korhonen, Yuping Zhao
Abstract:
Cross-layer optimization based on utility functions has recently been studied extensively, and numerous types of utility functions have been examined in the corresponding literature. However, a major drawback is that most utility functions take a fixed mathematical form or are based on simple combining, which cannot fully exploit the available information. In this paper, we formulate a framework for cross-layer optimization based on Adaptively Weighted Utility Functions (AWUF) for fairness balancing in OFDMA networks. Under this framework, a two-step allocation algorithm is provided as a sub-optimal solution, whose control parameters can be updated in real time to accommodate instantaneous QoS constraints. The simulation results show that the proposed algorithm achieves high throughput while balancing fairness among multiple users.
Keywords: OFDMA, Fairness, AWUF, QoS.

1703 Extended Set of DCT-TPLBP and DCT-FPLBP for Face Recognition
Authors: El Mahdi Barrah, Said Safi, Abdessamad Malaoui
Abstract:
In this paper, we describe an application for face recognition. Many studies have used local descriptors to characterize a face, but the performance of these local descriptors remains lower than that of global descriptors (which work on the entire image). The application of local descriptors (cutting the image into blocks) should combine the advantages of global and local methods in the Discrete Cosine Transform (DCT) domain. The system uses neural network techniques. The latter method provides a good compromise between the two approaches in terms of simplicity of calculation and classification performance. Finally, we compare our results with those obtained from other conventional local and global approaches.
Keywords: Face detection, face recognition, discrete cosine transform (DCT), FPLBP, TPLBP, NN.

1702 Application of RS and GIS Technique for Identifying Groundwater Potential Zone in Gomukhi Nadhi Sub Basin, South India
Authors: Punitha Periyasamy, Mahalingam Sudalaimuthu, Sachikanta Nanda, Arasu Sundaram
Abstract:
India holds 17.5% of the world's population but has only 2% of the world's total geographical area, and 27.35% of this area is categorized as wasteland due to a lack of groundwater. There is therefore a demand for more groundwater for agricultural and non-agricultural activities to balance the growth rate. With this in mind, an attempt is made to find the groundwater potential zones in the Gomukhi Nadhi sub-basin of the Vellar River basin, Tamil Nadu, India, covering an area of 1146.6 sq. km and consisting of 9 blocks from Peddanaickanpalayam to Virudhachalam. Thematic maps of geology, geomorphology, lineaments, land use and land cover, and drainage were prepared for the study area using IRS P6 data. The collateral data, including rainfall, water level, and soil maps, were collected for analysis and inference. The digital elevation model (DEM) was generated from Shuttle Radar Topography Mission (SRTM) data, and the slope of the study area was obtained from it. ArcGIS 10.1 serves as a powerful spatial analysis tool to find the groundwater potential zones in the study area by means of weighted overlay analysis. Each individual parameter of the thematic maps is ranked and weighted in accordance with its influence on groundwater recharge. The potential zones in the study area are classified as Very Good, Good, Moderate, and Poor, with areal extents of 15.67, 381.06, 575.38, and 174.49 sq. km, respectively.
Keywords: ArcGIS, DEM, Groundwater, Recharge, Weighted Overlay.
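The weighted overlay step itself is straightforward to sketch. In the Python snippet below the layer ranks, weights, and class breaks are random or invented, not the study's values:

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Weighted overlay of reclassified (ranked) raster layers: each cell's
    suitability score is the weight-normalised sum of its ranks."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return np.tensordot(weights, np.stack(layers), axes=1)

rng = np.random.default_rng(0)
shape = (50, 50)
geology   = rng.integers(1, 5, shape)   # rank 1 (poor) .. 4 (very good)
geomorph  = rng.integers(1, 5, shape)
lineament = rng.integers(1, 5, shape)
landuse   = rng.integers(1, 5, shape)
drainage  = rng.integers(1, 5, shape)
score = weighted_overlay([geology, geomorph, lineament, landuse, drainage],
                         weights=[30, 25, 20, 15, 10])
classes = np.digitize(score, bins=[2.0, 2.5, 3.0])  # poor/moderate/good/very good
print(np.bincount(classes.ravel(), minlength=4))
```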

1701 Clustering in WSN Based on Minimum Spanning Tree Using Divide and Conquer Approach
Authors: Uttam Vijay, Nitin Gupta
Abstract:
Due to the heavy energy constraints in WSNs, clustering is an efficient way to manage the energy of the sensors. Many clustering methods have already been proposed, and research is still ongoing to make clustering more energy efficient. In this paper we propose a minimum spanning tree (MST) based clustering using a divide and conquer approach. MST-based clustering was first proposed in the 1970s for large databases. Here we take a divide and conquer approach and implement it for wireless sensor networks, with the constraints attached to sensor networks. The divide and conquer approach is implemented in such a way that the whole MST does not have to be constructed before clustering: we find the edge that would belong to the MST of the corresponding graph and, if that edge can be removed according to certain constraints, divide the graph into clusters at that point, thereby saving a great deal of computation.
Keywords: Algorithm, Clustering, Edge-Weighted Graph, Weighted-LEACH.
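The baseline that the divide-and-conquer variant improves on, clustering by cutting the heaviest MST edges, can be sketched with SciPy; the divide-and-conquer construction itself is not reproduced:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import pdist, squareform

def mst_clusters(points, k):
    """Classic MST-based clustering: build the minimum spanning tree of the
    complete edge-weighted graph and cut its k-1 heaviest edges; the
    remaining connected components are the clusters."""
    dist = squareform(pdist(points))
    mst = minimum_spanning_tree(dist).toarray()
    cutoff = np.sort(mst[mst > 0])[-(k - 1)] if k > 1 else np.inf
    mst[mst >= cutoff] = 0                     # remove the heaviest edges
    _, labels = connected_components(mst, directed=False)
    return labels

rng = np.random.default_rng(0)
nodes = np.vstack([rng.normal(c, 0.3, (30, 2)) for c in ((0, 0), (3, 0), (0, 3))])
print(np.bincount(mst_clusters(nodes, k=3)))
```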

1700 Optimal Design for SARMA(P,Q)L Process of EWMA Control Chart
Authors: Y. Areepong
Abstract:
The main goal of this paper is to study Statistical Process Control (SPC) with the Exponentially Weighted Moving Average (EWMA) control chart when observations are serially correlated. A key characteristic of a control chart is the Average Run Length (ARL), which is the average number of samples taken before an action signal is given. Ideally, the ARL of an in-control process should be sufficiently large (denoted ARL0), whereas it should be small when the process is out of control; this is the Average Delay Time (ARL1), or mean time to a true alarm. We find explicit formulas of the ARL of the EWMA control chart for Seasonal Autoregressive and Moving Average (SARMA) processes with exponential white noise. The ARL results obtained from the explicit formula and from the integral equation are in good agreement. In particular, these formulas for evaluating ARL0 and ARL1 make it possible to obtain a set of optimal parameters, depending on the smoothing parameter (λ) and the width of the control limit (H), for designing an EWMA chart with minimum ARL1.
Keywords: Average Run Length, Optimal parameters, Exponentially Weighted Moving Average (EWMA) control chart.
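The quantities being optimised (ARL0 and ARL1) can be illustrated by simulation, even though the paper derives explicit formulas. The Python sketch below estimates them by Monte Carlo for an EWMA chart on AR(1)-type data with exponential-like noise; all parameter values are illustrative rather than optimal design values:

```python
import numpy as np

def ewma_arl(mu_shift=0.0, lam=0.1, H=0.7, phi=0.4,
             n_runs=1000, max_len=20_000, seed=0):
    """Monte Carlo estimate of the EWMA average run length for serially
    correlated AR(1) observations driven by zero-mean exponential noise.
    mu_shift=0 approximates ARL0; mu_shift>0 approximates ARL1."""
    rng = np.random.default_rng(seed)
    run_lengths = []
    for _ in range(n_runs):
        z = x = 0.0
        for t in range(1, max_len + 1):
            x = phi * x + (rng.exponential(1.0) - 1.0) + mu_shift
            z = lam * x + (1 - lam) * z
            if abs(z) > H:              # action signal
                run_lengths.append(t)
                break
        else:
            run_lengths.append(max_len)
    return float(np.mean(run_lengths))

print("ARL0 ~", round(ewma_arl(0.0), 1), " ARL1 ~", round(ewma_arl(0.5), 1))
```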

1699 Evaluation of Attribute II Bt Sweet Corn Resistance and Reduced-Risk Insecticide Applications for Control of Corn Earworm
Authors: R. Weinzierl, R. Estes, N. Tinsley, M. Keshlaf
Abstract:
The corn earworm, Helicoverpa zea Boddie, is a serious pest of corn. Larval feeding in ear tips destroys kernels and allows growth of fungi and production of mycotoxins. Infested sweet corn is not marketable. Development of improved transgenic hybrids expressing insecticidal toxins from Bacillus thuringiensis (Bt) may limit or prevent crop losses. Attribute® II Bt resistance and applications of Voliam Xpress insecticide were evaluated for effectiveness in controlling corn earworm in plots near Urbana, IL, USA, in 2013. Where no insecticides were applied, ear infestations and kernel damage in Attribute® II ‘Protector’ plots were consistently lower (near zero) than in plots of the non-Bt isoline ‘Garrison.’ Multiple applications of Voliam Xpress significantly reduced the number of corn earworm larvae and the kernel damage in the Garrison plots, but infestations and damage in these plots were greater than in Protector plots that did not receive insecticide applications. Our results indicate that Attribute® II Bt resistance is more effective than multiple applications of an insecticide for preventing losses caused by corn earworm in sweet corn.
Keywords: Bacillus thuringiensis, Helicoverpa zea, insect pest management, transgenic sweet corn.

1698 Detection Efficient Enterprises via Data Envelopment Analysis
Authors: S. Turkan
Abstract:
In this paper, data on Turkey's Top 500 Industrial Enterprises in 2014 were analyzed by data envelopment analysis. Data envelopment analysis is used to detect efficient decision-making units, such as universities, hospitals, and schools, by using inputs and outputs. The decision-making units in this study are enterprises. To detect efficient enterprises, several financial ratios are chosen as inputs and outputs; for this reason, financial indicators related to the productivity of enterprises are considered. The efficient foreign-weighted owned capital enterprises are detected via the super-efficiency model. According to the results, Mercedes-Benz is the most efficient foreign-weighted owned capital enterprise in Turkey.
Keywords: Data envelopment analysis, super efficiency, financial ratios, BCC model.

1697 Image Segmentation Based on Graph Theoretical Approach to Improve the Quality of Image Segmentation
Authors: Deepthi Narayan, Srikanta Murthy K., G. Hemantha Kumar
Abstract:
Graph-based image segmentation techniques are considered to be among the most efficient segmentation techniques, and are mainly used as time- and space-efficient methods for real-time applications. However, there is a need to focus on improving the quality of the segmented images obtained from the earlier graph-based methods. This paper proposes an improvement to the graph-based image segmentation methods already described in the literature. We contribute to the existing method by proposing the use of a weighted Euclidean distance to calculate the edge weights, which are the key element in building the graph. We also propose a slight modification of the segmentation method already described in the literature, which results in the selection of more prominent edges in the graph. The experimental results show an improvement in segmentation quality compared to the existing methods, with a slight compromise in efficiency.
Keywords: Graph based image segmentation, threshold, weighted Euclidean distance.
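The proposed edge weight is simple to state. A minimal Python sketch of a weighted Euclidean distance between two pixel feature vectors follows; the channel weights are a design choice, not values from the paper:

```python
import numpy as np

def edge_weight(feat_a, feat_b, channel_weights):
    """Weighted Euclidean distance between two pixel feature vectors
    (e.g. R, G, B intensities), used as the edge weight when building the
    segmentation graph."""
    w = np.asarray(channel_weights, dtype=float)
    d = np.asarray(feat_a, dtype=float) - np.asarray(feat_b, dtype=float)
    return float(np.sqrt(np.sum(w * d ** 2)))

# Two neighbouring pixels described by (R, G, B) intensities.
print(edge_weight([120, 80, 40], [110, 90, 35], channel_weights=[0.5, 0.3, 0.2]))
```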

1696 A Study on Local Wisdom towards Career Building of People in Kamchanoad Community
Authors: Phusit Phukamchanoad, Thananya Santithammakul, Suwaree Yordchim, Pennapa Palapin
Abstract:
This research gathered local wisdom related to the career building of people in Kamchanoad Community, Baan Muang sub-district, Baan Dung district, Udon Thani province. Data were collected through in-depth interviews with village headmen, community board members, teachers, monks, Kamchanoad forest managers, and revered elderly people aged over 60. All of these 30 interviewees have resided in Kamchanoad Community for more than 40 years. Descriptive data analysis revealed that the most prominent local wisdom of Kamchanoad Community lies in its beliefs and religion. Most people in the community have strongly maintained the local tradition of the festival of appeasing Chao Pu Sri Suttho in the middle of the 6th month of the Thai lunar calendar, which falls on the same day as Vesak Day. 100 percent of the people in this community are Buddhist. They believe that a Naga, a being taking the form of a serpent, named "Sri Suttho", lives in Kamchanoad forest. The local people worship the serpent and ask for blessings. Another element of local wisdom in this community is Sinh fabric weaving.
Keywords: Local Wisdom, Careers, Kamchanoad Community.

1695 Relaxing Convergence Constraints in Local Priority Hysteresis Switching Logic
Authors: Mubarak Alhajri
Abstract:
This paper addresses certain inherent limitations of local priority hysteresis switching logic. Our main result establishes that, under a persistent excitation assumption, it is possible to relax the constraints requiring strict positivity of the local priority and hysteresis switching constants. Relaxing these constraints allows the adaptive system to reach optimality, which implies improved performance. The unconstrained local priority hysteresis switching logic is examined, and conditions for global convergence are derived.
Keywords: Adaptive control, convergence, hysteresis constant, hysteresis switching.

1694 Design of Two-Channel Quadrature Mirror Filter Banks Using Digital All-Pass Filters
Authors: Ju-Hong Lee, Yi-Lin Shieh
Abstract:
The paper deals with the minimax design of two-channel linear-phase (LP) quadrature mirror filter (QMF) banks using infinite impulse response (IIR) digital all-pass filters (DAFs). Based on the theory of two-channel QMF banks using two IIR DAFs, the design problem is formulated so as to result in an appropriate Chebyshev approximation for the desired group delay responses of the IIR DAFs and the magnitude response of the low-pass analysis filter. Through a frequency sampling and iterative approximation method, the design problem can be solved by utilizing a weighted least squares approach. The resulting two-channel QMF banks can possess an approximately LP response without magnitude distortion. Simulation results are presented for illustration and comparison.
Keywords: Chebyshev approximation, Digital All-Pass Filter, Quadrature Mirror Filter, Weighted Least Squares.

1693 A New Weighted LDA Method in Comparison to Some Versions of LDA
Authors: Delaram Jarchi, Reza Boostani
Abstract:
Linear Discriminant Analysis (LDA) is a linear solution for the classification of two classes. In this paper, we propose a variant LDA method for the multi-class problem which redefines the between-class and within-class scatter matrices by incorporating a weight function into each of them. The aim is to separate the classes as much as possible in a situation where one class is well separated from the others; incidentally, that class should then have little influence on the classification. It has been suggested that the influence of well-separated classes can be alleviated by adding a weight into the between-class and within-class scatter matrices. To obtain a simple and effective weight function, ordinary LDA between every two classes is used to find the Fisher discrimination value, which is passed as an input to the two weight functions used to redefine the between-class and within-class scatter matrices. Experimental results showed that our new LDA method improved the classification rate on the glass, iris, and wine datasets in comparison to different versions of LDA.
Keywords: Discriminant vectors, weighted LDA, uncorrelation, principal components, Fisher-face method, Bootstrap method.
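A sketch of the weighted-scatter idea helps make the construction concrete. The Python snippet below down-weights pairwise between-class terms by a decreasing function of the class distance; the paper's exact weight functions (derived from pairwise Fisher values) and its within-class weighting are not reproduced:

```python
import numpy as np

def weighted_lda(X, y, weight_fn=lambda d: 1.0 / (d + 1e-6)):
    """Weighted multi-class LDA sketch: the between-class scatter is a sum
    of pairwise terms, each down-weighted when the two classes are already
    far apart (weight_fn is decreasing in the pairwise distance). The
    within-class scatter is left unweighted here."""
    classes = np.unique(y)
    dims = X.shape[1]
    means = {c: X[y == c].mean(axis=0) for c in classes}
    priors = {c: np.mean(y == c) for c in classes}
    S_w = sum(np.cov(X[y == c], rowvar=False) * priors[c] for c in classes)
    S_b = np.zeros((dims, dims))
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            diff = (means[ci] - means[cj])[:, None]
            dist = float(np.linalg.norm(diff))
            S_b += priors[ci] * priors[cj] * weight_fn(dist) * (diff @ diff.T)
    evals, evecs = np.linalg.eig(np.linalg.pinv(S_w) @ S_b)
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:len(classes) - 1]]   # discriminant vectors

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.5, (50, 3)) for m in ([0, 0, 0], [3, 0, 0], [0, 1, 0])])
y = np.repeat([0, 1, 2], 50)
print(weighted_lda(X, y).shape)
```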

1692 Standard Fuzzy Sets for Aircraft Selection using Multiple Criteria Decision Making Analysis
Authors: C. Ardil
Abstract:
This study uses two-dimensional standard fuzzy sets to enhance multiple criteria decision-making analysis for passenger aircraft selection, allowing decision-makers to express judgments with uncertain and vague information. Using two-dimensional fuzzy numbers, three decision makers evaluated three aircraft alternatives according to seven decision criteria. A validity analysis based on two-dimensional standard fuzzy weighted geometric (SFWG) and two-dimensional standard fuzzy weighted average (SFGA) operators is conducted to test the proposed approach's robustness and effectiveness in the fuzzy multiple criteria decision making (MCDM) evaluation process.
Keywords: Standard fuzzy sets (SFSs), aircraft selection, multiple criteria decision making, intuitionistic fuzzy sets (IFSs), SFWG, SFGA, MCDM