Search results for: Software Selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3014

2564 Automated Particle Picking based on Correlation Peak Shape Analysis and Iterative Classification

Authors: Hrabe Thomas, Beck Florian, Nickell Stephan

Abstract:

Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach adds peak shape analysis to the classical correlation step and uses an iterative classification approach to separate macromolecules from background. This particle selection workflow furthermore provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed on simulated and experimental data.

Keywords: Cryo-electron Microscopy, Single Particle Analysis, Image Processing.
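The correlation-based picking step described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): cross-correlate a micrograph with a particle template and keep local maxima of the correlation map above a threshold. The template shape, threshold and minimum peak distance below are illustrative assumptions.

```python
# Minimal illustration of correlation-based particle picking (not the authors' code).
# A micrograph is cross-correlated with a particle template; local maxima of the
# correlation map above a threshold are reported as candidate particle positions.
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def pick_particles(micrograph, template, threshold=0.5, min_distance=10):
    # Cross-correlation with a zero-mean, unit-norm template (a simple matched
    # filter; full local normalisation is omitted for brevity).
    t = template - template.mean()
    t /= (np.linalg.norm(t) + 1e-12)
    m = micrograph - micrograph.mean()
    corr = fftconvolve(m, t[::-1, ::-1], mode="same")
    corr /= (np.abs(corr).max() + 1e-12)            # scale for thresholding

    # A pixel is a candidate if it is the maximum within its neighbourhood
    # and its correlation value exceeds the threshold.
    local_max = maximum_filter(corr, size=min_distance) == corr
    peaks = np.argwhere(local_max & (corr > threshold))
    return peaks, corr

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = np.zeros((15, 15))
    yy, xx = np.mgrid[-7:8, -7:8]
    template[yy**2 + xx**2 <= 36] = 1.0             # disc-shaped "particle"
    micrograph = rng.normal(0, 0.3, (200, 200))
    for y, x in [(40, 60), (120, 150), (170, 30)]:  # plant three particles
        micrograph[y-7:y+8, x-7:x+8] += template
    peaks, _ = pick_particles(micrograph, template, threshold=0.5, min_distance=15)
    print("candidate particle centres:", peaks.tolist())
```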

2563 Software Evolution Based Sequence Diagrams Merging

Authors: Zine-Eddine Bouras, Abdelouaheb Talai

Abstract:

The need to merge software artifacts seems inherent to modern software development. Distributing development over several teams and breaking tasks into smaller, more manageable pieces are effective means to deal with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. In addition, the earlier changes are introduced into the life cycle, the easier their management by designers. Interaction-based specifications such as UML sequence diagrams have been found effective in this regard. As a result, sequence diagrams can be used not only for capturing system behaviors but also for merging changes in order to create a new version. The objective of this paper is to suggest a new approach to the problem of software merging at the level of sequence diagrams by using the concept of dependence analysis, which formally captures all mappings and differences between elements of sequence diagrams and serves as a key concept for creating a new version of a sequence diagram.

Keywords: System behaviors, sequence diagram merging, dependence analysis, sequence diagram slicing.

2562 A Mapping Approach of Code Generation for Arinc653-Based Avionics Software

Authors: Lu Zou, Dianfu MA, Ying Wang, Xianqi Zhao

Abstract:

Avionics software architecture has transitioned from a federated architecture to integrated modular avionics (IMA). ARINC 653 (Avionics Application Standard Software Interface) is a software specification for space and time partitioning in safety-critical avionics real-time operating systems. Methods to transform abstract avionics application logic into an executable model have been proposed, but with little consideration of the code-generation input and output models specific to the ARINC 653 platform and of the inner-task synchronous dynamic interaction order. In this paper, we propose an AADL-based model-driven design methodology for automatically generating a Cµ executable model on the ARINC 653 platform from an ARINC 653 architecture defined as AADL653, in order to facilitate the development of avionics software built on an ARINC 653 OS. The paper presents the mapping rules between AADL653 elements and Cµ language elements, defines the code-generation rules, and designs an automatic Cµ code generator. A case study illustrates the approach, followed by related work and future research directions.

Keywords: IMA, ARINC653, AADL653, code generation.

2561 Analysis of Different Combining Schemes of Two Amplify-Forward Relay Branches with Individual Links Experiencing Nakagami Fading

Authors: Babu Sena Paul, Ratnajit Bhattacharjee

Abstract:

Relay-based communication has gained considerable importance in recent years. In this paper we derive the end-to-end statistics of a two-hop non-regenerative relay branch, each hop being Nakagami-m faded. Closed-form expressions for the probability density functions of the signal envelope at the output of a selection combiner and a maximal ratio combiner at the destination node are also derived, and the analytical formulations are verified through computer simulation. These density functions are useful in evaluating the system performance in terms of bit error rate and outage probability.

Keywords: co-operative diversity, diversity combining, maximal ratio combining, selection combining.
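The closed-form densities are not reproduced here, but the two combining schemes can be checked numerically. A minimal Monte Carlo sketch is given below; it assumes unit average power per hop and models the end-to-end SNR of each amplify-and-forward branch with the common harmonic-mean approximation, which is an assumption of this illustration rather than the paper's exact derivation.

```python
# Monte Carlo sketch comparing selection combining (SC) and maximal ratio combining
# (MRC) of two dual-hop amplify-and-forward branches whose hops are Nakagami-m faded.
# End-to-end branch SNR uses the harmonic-mean approximation gamma = g1*g2/(g1+g2),
# an assumption of this sketch, not the paper's closed-form result.
import numpy as np

rng = np.random.default_rng(1)

def nakagami_power_gain(m, omega, size):
    # If the envelope R is Nakagami-m with spread omega, then R^2 ~ Gamma(m, omega/m).
    return rng.gamma(shape=m, scale=omega / m, size=size)

def branch_snr(m, snr_avg, size):
    g1 = nakagami_power_gain(m, 1.0, size) * snr_avg   # first-hop instantaneous SNR
    g2 = nakagami_power_gain(m, 1.0, size) * snr_avg   # second-hop instantaneous SNR
    return g1 * g2 / (g1 + g2)                         # end-to-end (harmonic-mean) SNR

n, m, snr_avg_db, gamma_th_db = 1_000_000, 2.0, 10.0, 3.0
snr_avg, gamma_th = 10 ** (snr_avg_db / 10), 10 ** (gamma_th_db / 10)

b1 = branch_snr(m, snr_avg, n)
b2 = branch_snr(m, snr_avg, n)

outage_sc = np.mean(np.maximum(b1, b2) < gamma_th)     # SC: pick the stronger branch
outage_mrc = np.mean((b1 + b2) < gamma_th)             # MRC: coherently add branch SNRs

print(f"outage (SC)  = {outage_sc:.4f}")
print(f"outage (MRC) = {outage_mrc:.4f}")
```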

2560 Extracting Single Trial Visual Evoked Potentials using Selective Eigen-Rate Principal Components

Authors: Samraj Andrews, Ramaswamy Palaniappan, Nidal Kamel

Abstract:

In single trial analysis, when using Principal Component Analysis (PCA) to extract Visual Evoked Potential (VEP) signals, the selection of principal components (PCs) is an important issue. We propose a new method here that selects only the appropriate PCs, denoted selective eigen-rate (SER). In this method, the VEP is reconstructed based on the rate of the eigenvalues of the PCs. When this technique is applied to emulated VEP signals added to background electroencephalogram (EEG), with a focus on extracting the evoked P3 parameter, it is found to be feasible. The improvement in signal-to-noise ratio (SNR) is superior to that of two other existing methods of PC selection: Kaiser (KSR) and Residual Power (RP). Although another PC selection method, Spectral Power Ratio (SPR), gives a comparable SNR at high noise levels (i.e., strong background EEG), SER still gives better results in such cases. Next, we applied the SER method to real VEP signals to analyse the P3 responses for matched and non-matched stimuli. The P3 parameters extracted through the proposed SER method showed a higher P3 response for the matched stimulus, which conforms to existing neuroscience knowledge. Single trial PCA using the KSR and RP methods failed to indicate any difference between the stimuli.

Keywords: Electroencephalogram, P3, Single trial VEP.
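A minimal sketch of the general idea follows: decompose the trial matrix, keep only components whose eigenvalue share exceeds a cut-off, and reconstruct. The specific cut-off and the exact form of the eigen-rate criterion used here are illustrative assumptions, not the authors' SER formula.

```python
# Illustrative single-trial VEP denoising with PCA/SVD, keeping only components
# whose eigenvalue share exceeds a cut-off (an assumed stand-in for the SER rule).
import numpy as np

def ser_like_reconstruct(trials, share_cutoff=0.05):
    """trials: (n_trials, n_samples) array of single-trial epochs containing a VEP."""
    U, s, Vt = np.linalg.svd(trials, full_matrices=False)
    share = s**2 / np.sum(s**2)          # eigenvalue share of each principal component
    keep = share > share_cutoff          # eigen-rate style cut-off (assumed form)
    return (U[:, keep] * s[keep]) @ Vt[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 256)
    p3 = 2.0 * np.exp(-((t - 0.3) / 0.05) ** 2)      # toy P3-like component near 300 ms
    trials = p3 + rng.normal(0, 1.0, (40, 256))      # 40 noisy single trials
    clean = ser_like_reconstruct(trials)
    print("SNR before:", round(p3.var() / (trials - p3).var(), 2),
          "after:", round(p3.var() / (clean - p3).var(), 2))
```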

2559 Investigation of Organizational Work-Life Imbalance of Thai Software Developers in a Multinational Software Development Firm using Fishbone Diagram for Knowledge Management

Authors: N. Mantalay, N. Chakpitak, W. Janchai, P. Sureepong

Abstract:

Work stress causes organizational work-life imbalance among employees. Because of this imbalance, workers put less effort into finishing assignments and the organization experiences reduced productivity. To investigate this problem, this qualitative case study focuses on organizational work-life imbalance among Thai software developers in a German-owned company in Chiang Mai, Thailand. In terms of knowledge management, the fishbone diagram is a useful analysis tool for systematically investigating the root causes of an organizational work-life imbalance in focus-group discussions. Furthermore, the fishbone diagram shows the relationship between causes and effects clearly. It was found that the organizational work-life imbalance among Thai software developers is influenced by the management team, the work environment, and the information tools used in the company over time.

Keywords: knowledge management, knowledge worker, work-life imbalance, fishbone diagram.

2558 A Comparative Study of Additive and Nonparametric Regression Estimators and Variable Selection Procedures

Authors: Adriano Z. Zambom, Preethi Ravikumar

Abstract:

One of the biggest challenges in nonparametric regression is the curse of dimensionality. Additive models are known to overcome this problem by estimating only the individual additive effects of each covariate. However, if the model is misspecified, the accuracy of the estimator compared to the fully nonparametric one is unknown. In this work the efficiency of completely nonparametric regression estimators such as Loess is compared to that of estimators that assume additivity, in several situations including additive and non-additive regression scenarios. The comparison is done by computing the oracle mean square error of the estimators with respect to the true nonparametric regression function. Then, a backward elimination selection procedure based on the Akaike Information Criterion is proposed, which is computed from either the additive or the nonparametric model. Simulations show that if the additive model is misspecified, the percentage of time it fails to select important variables can be higher than that of the fully nonparametric approach. A dimension reduction step is included when the nonparametric estimator cannot be computed due to the curse of dimensionality. Finally, the Boston housing dataset is analyzed using the proposed backward elimination procedure and the selected variables are identified.

Keywords: Additive models, local polynomial regression, residuals, mean square error, variable selection.
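For readers unfamiliar with AIC-driven backward elimination, here is a minimal sketch of the selection loop. The paper computes the criterion from additive or fully nonparametric fits; plain ordinary least squares is used here only to keep the illustration short, and the data are synthetic.

```python
# Backward elimination driven by AIC, sketched with an ordinary least-squares fit.
import numpy as np

def aic_ols(X, y):
    n, p = X.shape
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = np.sum((y - Xd @ beta) ** 2)
    return n * np.log(rss / n) + 2 * (p + 1)      # Gaussian AIC up to an additive constant

def backward_elimination(X, y, names):
    selected = list(range(X.shape[1]))
    current = aic_ols(X[:, selected], y)
    improved = True
    while improved and len(selected) > 1:
        improved = False
        # Try dropping each remaining variable; keep the drop that lowers AIC the most.
        scores = [(aic_ols(X[:, [j for j in selected if j != k]], y), k) for k in selected]
        best_aic, worst_var = min(scores)
        if best_aic < current:
            selected.remove(worst_var)
            current, improved = best_aic, True
    return [names[j] for j in selected]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 300
    X = rng.normal(size=(n, 5))
    y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(0, 1, n)   # only x1 and x3 matter
    print(backward_elimination(X, y, ["x1", "x2", "x3", "x4", "x5"]))
```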

2557 ANN-Based Classification of Indirect Immuno Fluorescence Images

Authors: P. Soda, G.Iannello

Abstract:

In this paper we address the issue of classifying the fluorescent intensity of a sample in Indirect Immuno-Fluorescence (IIF). Since IIF is by its very nature a subjective, semi-quantitative test, we discuss a strategy to reliably label the image data set by using the diagnoses performed by different physicians. Then, we discuss image pre-processing, feature extraction and selection. Finally, we propose two ANN-based classifiers that can separate intrinsically dubious samples and whose error tolerance can be flexibly set. Measured performance shows error rates of less than 1%, making the method a candidate for use in daily medical practice, either to pre-select the cases to be examined or to act as a second reader.

Keywords: Artificial neural networks, computer aided diagnosis, image classification, indirect immuno-fluorescence, pattern recognition.

2556 Porul: Option Generation and Selection and Scoring Algorithms for a Tamil Flash Card Game

Authors: Anitha Narasimhan, Aarthy Anandan, Madhan Karky, C. N. Subalalitha

Abstract:

Games can be excellent tools for teaching a language. There are a few e-learning games in Indian languages, such as word scrabble, crossword and quiz games, which were developed mainly for educational purposes. This paper proposes a Tamil word game called "Porul", which focuses on education as well as on players' thinking and decision-making skills. Porul is a multiple-choice quiz game, in which the players attempt to answer questions correctly from the given options, which are generated using a unique algorithm called the Option Selection algorithm. This algorithm explores the semantics of the question in various dimensions, namely synonym, rhyme and Universal Networking Language semantic category. This kind of semantic exploration of the question not only increases the complexity of the game but also makes it more interesting. The paper also proposes a Scoring Algorithm which allots a score based on the popularity score of the question word. The proposed game has been tested using 20,000 Tamil words.

Keywords: Porul game, Tamil word game, option selection, flash card, scoring, algorithm.

2555 Using PFA in Feature Analysis and Selection for H.264 Adaptation

Authors: Nora A. Naguib, Ahmed E. Hussein, Hesham A. Keshk, Mohamed I. El-Adawy

Abstract:

Classification of video sequences based on their content is a vital process for adaptation techniques. It helps decide which adaptation technique best fits the resource reduction requested by the client. In this paper we use the Principal Feature Analysis (PFA) algorithm to select a reduced subset of video features. The main idea is to select only one feature from each class, based on the similarities between the features within that class. Our results show that, using this feature reduction technique, the source video features can be completely omitted from future classification of video sequences.

Keywords: Adaptation, feature selection, H.264, Principal Feature Analysis (PFA)
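A minimal sketch of Principal Feature Analysis follows: the rows of the PCA loading matrix (one per original feature) are clustered, and the feature closest to each cluster centre is kept as that cluster's representative. The component count, cluster count and data below are illustrative choices, not values from the paper.

```python
# Sketch of Principal Feature Analysis (PFA): cluster the feature rows of the PCA
# loading matrix and keep one representative feature per cluster.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def principal_feature_analysis(X, n_components=3, n_selected=4, random_state=0):
    pca = PCA(n_components=n_components).fit(X)
    loadings = pca.components_.T                   # shape: (n_features, n_components)
    km = KMeans(n_clusters=n_selected, n_init=10, random_state=random_state).fit(loadings)
    selected = []
    for c in range(n_selected):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(loadings[members] - km.cluster_centers_[c], axis=1)
        selected.append(members[np.argmin(dists)])  # representative feature of this cluster
    return sorted(selected)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.normal(size=(500, 4))
    # Build 8 features from 4 underlying factors, so several features are redundant.
    X = np.column_stack([base, base + 0.05 * rng.normal(size=(500, 4))])
    print("selected feature indices:", principal_feature_analysis(X, 3, 4))
```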

2554 A Framework for an Automated Decision Support System for Selecting Safety-Conscious Contractors

Authors: Rawan A. Abdelrazeq, Ahmed M. Khalafallah, Nabil A. Kartam

Abstract:

Selection of competent contractors for construction projects is usually accomplished through competitive bidding or negotiated contracting, in which the contract bid price is the basic criterion for selection. The evaluation of contractors' safety performance is still not a typical criterion in the selection process, despite the existence of various safety prequalification procedures. There is a critical need for practical and automated systems that enable owners and decision makers to evaluate contractor safety performance, among other important contractor selection criteria. These systems should ultimately favor safety-conscious contractors, selected by virtue of their past good safety records and current safety programs. This paper presents an exploratory sequential mixed-methods approach to develop a framework for an automated decision support system that evaluates contractor safety performance based on a multitude of indicators and metrics identified through a comprehensive review of construction safety research and a survey distributed to domain experts. The framework is developed in three phases: (1) determining the indicators that depict contractor current and past safety performance; (2) soliciting input from construction safety experts regarding the identified indicators, their metrics, and relative significance; and (3) designing a decision support system using relational database models to integrate the identified indicators and metrics into a system that assesses and rates the safety performance of contractors. The proposed automated system is expected to hold several advantages, including: (1) reducing the likelihood of selecting contractors with poor safety records; (2) enhancing the odds of completing the project safely; and (3) encouraging contractors to exert more effort to improve their safety performance and practices in order to increase their bid-winning opportunities, which can lead to significant safety improvements in the construction industry. This should prove useful to decision makers and researchers alike and should help improve the safety record of the construction industry.

Keywords: Construction safety, contractor selection, decision support system, relational database.
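The relational-database part of such a system can be sketched in a few lines: indicator definitions and contractor scores live in tables, and a weighted safety rating is computed with a join. The table layout, indicator names and weights below are hypothetical placeholders, not the framework's actual schema.

```python
# Toy sketch of a relational safety-rating query (schema and data are made up).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE indicator (id INTEGER PRIMARY KEY, name TEXT, weight REAL);
CREATE TABLE contractor (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE score (contractor_id INTEGER, indicator_id INTEGER, value REAL,
                    FOREIGN KEY(contractor_id) REFERENCES contractor(id),
                    FOREIGN KEY(indicator_id) REFERENCES indicator(id));
""")
con.executemany("INSERT INTO indicator VALUES (?, ?, ?)",
                [(1, "recordable incident rate", 0.4),
                 (2, "safety training hours", 0.3),
                 (3, "safety program audit score", 0.3)])
con.executemany("INSERT INTO contractor VALUES (?, ?)", [(1, "Alpha"), (2, "Beta")])
con.executemany("INSERT INTO score VALUES (?, ?, ?)",
                [(1, 1, 0.8), (1, 2, 0.7), (1, 3, 0.9),
                 (2, 1, 0.6), (2, 2, 0.9), (2, 3, 0.5)])

rows = con.execute("""
SELECT c.name, SUM(i.weight * s.value) AS safety_rating
FROM contractor c JOIN score s ON s.contractor_id = c.id
                  JOIN indicator i ON i.id = s.indicator_id
GROUP BY c.id ORDER BY safety_rating DESC
""").fetchall()
for name, rating in rows:
    print(f"{name}: {rating:.2f}")
```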

2553 Parkinson's Disease Classification using Neural Network and Feature Selection

Authors: Anchana Khemphila, Veera Boonjing

Abstract:

In this study, a Multi-Layer Perceptron (MLP) with the Back-Propagation learning algorithm is used for the effective diagnosis of Parkinson's disease (PD), a challenging problem for the medical community. Typically characterized by tremor, PD occurs due to the loss of dopamine in the brain's thalamic region, which results in involuntary or oscillatory movement in the body. A feature selection algorithm is applied along with biomedical test values to diagnose the disease. Clinical diagnosis is done mostly through a doctor's expertise and experience, but cases of wrong diagnosis and treatment are still reported, and patients are asked to take a number of tests; in many cases, not all of these tests contribute towards an effective diagnosis. Our work is to classify the presence of Parkinson's disease with a reduced number of attributes. Originally, 22 attributes are involved in the classification. We use Information Gain to determine which attributes need to be taken from patients, reducing the twenty-two attributes to sixteen, and artificial neural networks are used to classify the patients' diagnoses. The accuracy is 82.051% on the training data set and 83.333% on the validation data set.

Keywords: Data mining, classification, Parkinson disease, artificial neural networks, feature selection, information gain.
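The two-stage pipeline described above can be sketched as follows: rank attributes by an information-gain-style score, keep the top ones, then train a multi-layer perceptron. Synthetic data stands in for the 22-attribute Parkinson's dataset; only the "reduce 22 to 16 attributes" figure is taken from the abstract.

```python
# Sketch of information-gain-style feature selection followed by an MLP classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=400, n_features=22, n_informative=8,
                           n_redundant=6, random_state=0)     # stand-in data

scores = mutual_info_classif(X, y, random_state=0)            # information-gain-like ranking
top16 = np.argsort(scores)[::-1][:16]                         # reduce 22 -> 16 attributes
X_sel = X[:, top16]

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
clf.fit(X_tr, y_tr)
print("validation accuracy:", round(clf.score(X_te, y_te), 3))
```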

2552 Adaptive and Personalizing Learning Sequence Using Modified Roulette Wheel Selection Algorithm

Authors: Melvin A. Ballera

Abstract:

Prior literature in the field of adaptive and personalized learning sequences in e-learning has proposed and implemented various mechanisms to improve the learning process, such as individualization and personalization, but these are complex to implement due to expensive algorithmic programming and the need for extensive prior data. The main objective of personalizing a learning sequence is to maximize learning by dynamically selecting the closest teaching operation in order to achieve the learning competency of the learner. In this paper, a revolutionary technique is proposed and tested to perform individualization and personalization using a modified reversed roulette wheel selection algorithm that runs in O(n). The technique is simpler to implement and algorithmically less expensive compared to other evolutionary algorithms, since it collects the dynamic real-time performance matrix, such as examinations, reviews, and study, to form the single numerical RWSA fitness value. Results show that the implemented system is capable of recommending new learning sequences that lessen study time based on the student's prior knowledge and real performance matrix.

Keywords: E-learning, fitness value, personalized learning sequence, reversed roulette wheel selection algorithms.
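For reference, a plain roulette wheel selection routine is sketched below, together with a "reversed" variant in which lower fitness receives a larger slice of the wheel (one plausible reading of the paper's reversal; the authors' exact modification is not reproduced). Both variants run in O(n) per draw, and the topics and scores are made up.

```python
# Classic roulette wheel selection plus a "reversed" variant (an assumed reading
# of the paper's idea), each a single O(n) pass over the wheel.
import random

def roulette_select(items, fitness, reverse=False, rng=random):
    if reverse:
        # Invert the wheel: weight each item by (max fitness - its fitness) + epsilon.
        top = max(fitness)
        weights = [top - f + 1e-9 for f in fitness]
    else:
        weights = list(fitness)
    total = sum(weights)
    pick = rng.uniform(0, total)
    acc = 0.0
    for item, w in zip(items, weights):    # single pass over the wheel: O(n)
        acc += w
        if pick <= acc:
            return item
    return items[-1]

if __name__ == "__main__":
    topics = ["loops", "recursion", "arrays", "pointers"]
    mastery = [0.9, 0.4, 0.7, 0.2]         # toy per-topic performance scores
    random.seed(1)
    # Reversed selection tends to pick the topics the learner has mastered least.
    picks = [roulette_select(topics, mastery, reverse=True) for _ in range(10)]
    print(picks)
```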

2551 Prioritization of Customer Order Selection Factors by Utilizing Conjoint Analysis: A Case Study for a Structural Steel Firm

Authors: Burcu Akyildiz, Cigdem Kadaifci, Y. Ilker Topcu, Burc Ulengin

Abstract:

In today’s business environment, companies should make strategic decisions to gain sustainable competitive advantage. Order selection is a crucial issue among these decisions, especially for the steel production industry. When companies allocate a high proportion of their design and production capacities to their ongoing projects, determining which customer order should be chosen among the potential orders without exceeding the remaining capacity is the major critical problem. This study aims to identify and prioritize the evaluation factors for the customer order selection problem. Conjoint Analysis is used to examine the importance level of each factor, which is determined as the potential profit rate per unit of time, the compatibility of the potential order with available capacity, the level of potential future orders with higher profit, customer credit of future business opportunity, and the negotiability level of the production schedule for the order.

Keywords: Conjoint analysis, order prioritization, profit management, structural steel firm.
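The basic mechanics of conjoint analysis can be shown in a few lines: respondents rate hypothetical order profiles, and part-worth utilities of each factor level are recovered by least squares on dummy-coded levels. The factor names, levels and ratings below are made-up placeholders, not the study's data or its full five-factor design.

```python
# Minimal conjoint-style estimation of part-worth utilities via dummy-coded least squares.
import numpy as np

factors = {
    "profit rate":        ["low", "high"],
    "capacity fit":       ["poor", "good"],
    "future opportunity": ["weak", "strong"],
}
# Each profile is a tuple of level indices (one per factor) plus an average rating.
profiles = [((0, 0, 0), 2.0), ((1, 0, 0), 5.5), ((0, 1, 0), 4.0), ((0, 0, 1), 3.0),
            ((1, 1, 0), 7.5), ((1, 0, 1), 6.5), ((0, 1, 1), 5.0), ((1, 1, 1), 9.0)]

# Dummy-code the second level of each factor (the first level is the baseline).
X = np.array([[lv[j] for j in range(len(factors))] for lv, _ in profiles], dtype=float)
y = np.array([r for _, r in profiles])
Xd = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)

print("baseline rating:", round(coef[0], 2))
for (name, levels), w in zip(factors.items(), coef[1:]):
    print(f"part-worth of '{levels[1]}' {name}: {w:+.2f}")
```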

2550 A New Method for Complex Goods Selection in Electronic Markets

Authors: Mohammad Ali Tabarzad, Caro Lucas, Nassim Jafarzadeh Eslami

Abstract:

With the development of the Internet, a suitable discipline for trading goods electronically has emerged. However, this type of market is still not mature enough to become independent and get closer to sellers' and buyers' needs. Furthermore, the goods bought and sold in these markets still lack the essential standards needed to be well defined. In this paper, we present a model for developing a market that can contain goods with variable definitions, and we investigate its characteristics. Moreover, since people discriminate among attributes differently, the significance of each attribute of a specific product may vary from one person's viewpoint to another's. Consequently, we present a weighting model through which different people's viewpoints can be satisfied. These two aspects are discussed in detail throughout this paper.

Keywords: Electronic markets, selection of multi-attribute goods, data fusion.

2549 Load Modeling for Power Flow and Transient Stability Computer Studies at BAKHTAR Network

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

A method has been developed for preparing load models for power flow and stability studies. The load modeling (LOADMOD) computer software transforms data on load class mix, composition, and characteristics into the form required for commonly used power flow and transient stability simulation programs. Typical default data have been developed for load composition and characteristics. This paper describes the LOADMOD software, the dynamic and static load modeling techniques used in it, and the results of initial testing on the BAKHTAR power system.

Keywords: Load Modelling, Static, Power Flow.

2548 Software Maintenance Severity Prediction for Object Oriented Systems

Authors: Parvinder S. Sandhu, Roma Jaswal, Sandeep Khimta, Shailendra Singh

Abstract:

Since the majority of faults are found in a few of a system's modules, there is a need to identify the modules that are affected more severely than others and to carry out proper maintenance in time, especially for critical applications. Neural networks have already been applied in software engineering to build reliability growth models and to predict gross change or reusability metrics. Neural networks are non-linear, sophisticated modeling techniques that are able to model complex functions; they are used when the exact nature of the inputs and outputs is not known, and a key feature is that they learn the relationship between input and output through training. In the present work, various neural-network-based techniques are explored and a comparative analysis is performed for predicting the level of maintenance needed, by predicting the severity level of faults present in NASA's public domain defect dataset. The different algorithms are compared on the basis of Mean Absolute Error, Root Mean Square Error and Accuracy values. It is concluded that Generalized Regression Networks is the best algorithm for classifying software components into different levels of severity of fault impact. The algorithm can be used to develop a model for identifying modules that are heavily affected by faults.

Keywords: Neural Network, Software faults, Software Metric.

2547 Design of Domain-Specific Software Systems with Parametric Code Templates

Authors: Kostyantyn Yermashov, Karsten Wolke, Karl Hayo Siemsen

Abstract:

Domain-specific languages describe specific solutions to problems in the application domain. Traditionally they form a solution by composing black-box abstractions together. This usually involves shallow transformations over the target model. In this paper we argue that it is potentially powerful to operate with grey-box abstractions to build a domain-specific software system. We present parametric code templates as grey-box abstractions and conceptual tools to encapsulate and manipulate these templates. Manipulations introduce template merging routines and can be defined in a generic way; this involves reasoning mechanisms at the level of code templates. We introduce the Neurath Modelling Language (NML), which operates with parametric code templates and specifies a visualisation mapping mechanism for target models. Finally we provide an example of constructing a domain-specific software system with predefined NML elements.

Keywords: software design, code templates, domain-specific languages, modelling languages, generic tools

2546 Comparison of the Distillation Curve Obtained Experimentally with the Curve Extrapolated by a Commercial Simulator

Authors: Lívia B. Meirelles, Erika C. A. N. Chrisman, Flávia B. de Andrade, Lilian C. M. de Oliveira

Abstract:

True Boiling Point (TBP) distillation is one of the most common experimental techniques for the determination of petroleum properties. This curve provides information about the performance of the petroleum in terms of its cuts. The experiment takes a few days to perform. Faster techniques use software that calculates the distillation curve when only limited information about the crude oil is known. In order to evaluate the accuracy of distillation curve prediction, eight points of the TBP curve and the specific gravity curve (348 K and 523 K) were inserted into the HYSYS Oil Manager, and the extended curve was evaluated up to 748 K. The methods were able to predict the curve with errors of 0.6%-9.2% (software vs. ASTM) and 0.2%-5.1% (software vs. Spaltrohr).

Keywords: Distillation curve, petroleum distillation, simulation, true boiling point curve.

2545 File Format of Flow Chart Simulation Software - CFlow

Authors: Syahanim Mohd Salleh, Zaihosnita Hood, Hairulliza Mohd Judi, Marini Abu Bakar

Abstract:

CFlow is flow chart software that contains facilities to draw and evaluate a flow chart. Flow chart evaluation applies a simulation method to enable presentation of the work flow in a flow chart solution. Flow chart simulation in CFlow is executed by manipulating the CFlow data file, which is saved in a graphical vector format. These text-based data are organised using a data classification technique based on a library classification scheme. This paper describes the file format for the flow chart simulation software CFlow.

Keywords: CFlow, flow chart, file format.

2544 Software Reengineering Tool for Traffic Accident Data

Authors: Jagdeep Kaur, Parvinder S. Sandhu, Birinderjit Singh, Amit Verma, Sanyam Anand

Abstract:

In today's fast-paced world, where everyone is short of time and works haphazardly, a similar scene is common on the roads. To limit the fatal consequences of speeding traffic on busy lanes, software that analyses and keeps account of traffic and the resulting congestion is used in developed countries. This software has been implemented and used with the help of a support tool called the Critical Analysis Reporting Environment, of which two versions exist. The current research paper examines the issues and problems encountered while using these two versions in practice. Further, a hybrid architecture is proposed that retains the quality and performance of both and is better in terms of component coupling, maintenance and many other features.

Keywords: Critical Analysis Reporting Environment, coupling, hybrid architecture.

2543 Site Selection of Public Parking in Isfahan City, using AHP Model

Authors: M. Ahmadi Baseri, R. Mokhtari Malekabadi, A. Gandomkar

Abstract:

Nowadays, one of the most important problems of metropolises and large cities worldwide is traffic congestion and the lack of sufficient parking space for vehicles. Isfahan, as the third metropolis of Iran, has encountered parking problems in most parts of its fourteen municipal regions. The unsystematic distribution and shortage of parking sites in the city have created an unfavorable traffic situation, increased air and noise pollution, wasted much of citizens' and travelers' money and time on urban routes, and disturbed their mental calm, leading to intense dissatisfaction. In this study, the criteria that are effective in selecting public parking sites were combined with each other using the AHP model in a GIS environment, and the overlay of the resulting layers represents the suitability of areas for parking. The results of this research indicate fairly appropriate public parking site selection in region 3 of Isfahan; however, the inconsequential distribution and shortage of parking sites in this region have caused abundant transportation problems in the city.

Keywords: Public parking lots, Parking site selection, Geographical Information System (GIS), Hierarchical Analysis Model, Isfahan city.
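The AHP weighting step can be illustrated with a standard sketch: derive priority weights from a pairwise comparison matrix via the principal eigenvector and check the consistency ratio. The criteria and judgements below are made-up placeholders, not those used in the Isfahan study.

```python
# Standard AHP weight calculation from a pairwise comparison matrix.
import numpy as np

criteria = ["distance to demand", "land cost", "access to main roads", "land area"]
# A[i, j] = how much more important criterion i is than criterion j (Saaty's 1-9 scale).
A = np.array([[1,   3,   2,   5],
              [1/3, 1,   1/2, 2],
              [1/2, 2,   1,   3],
              [1/5, 1/2, 1/3, 1]], dtype=float)

evals, evecs = np.linalg.eig(A)
k = np.argmax(evals.real)
w = np.abs(evecs[:, k].real)
w /= w.sum()                                   # priority vector (weights sum to 1)

n = A.shape[0]
ci = (evals.real[k] - n) / (n - 1)             # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index for n = 4
print("consistency ratio:", round(ci / ri, 3), "(acceptable if < 0.1)")
for name, weight in zip(criteria, w):
    print(f"{name}: {weight:.3f}")
```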

2542 The Use of Chlorophyll Meter Readings for the Selection of Maize Inbred Lines under Drought Stress

Authors: F. Gekas, C. Pankou, I. Mylonas, E. Ninou, E. Sinapidou, A. Lithourgidis, F. Papathanasiou, J. –K. Petrevska, F. Papadopoulou, P. Zouliamis, G. Tsaprounis, I. Tokatlidis, C. Dordas

Abstract:

The present study aimed to investigate whether chlorophyll meter readings (SPAD) can be used as a criterion for single-plant selection in maize breeding. Experimentation was performed at the ultra-low density of 0.74 plants/m2 so that the potential yield per plant could be fully expressed. R-31 honeycomb experiments were conducted in three different areas of Greece (Thessaloniki, Giannitsa and Florina) using 30 inbred lines under well-watered and water-stressed conditions during the 2012 growing season. The chlorophyll meter readings were higher under dry conditions, except at the Giannitsa location, where the differences were not significant. The genotypes with the highest chlorophyll meter readings were consistent across areas, emphasizing the stability of the character. A positive correlation between the chlorophyll meter readings and grain yield strengthened over time and culminated at the physiological maturity stage. There was a clear sign that chlorophyll meter readings have the potential to be used for the selection of stress-adaptive genotypes and may permit modern maize to be grown in a wider range of environments, addressing climate change scenarios.

Keywords: Drought-prone environments, honeycomb breeding, SPAD, Zea mays.

2541 A Subtractive Clustering Based Approach for Early Prediction of Fault Proneness in Software Modules

Authors: Ramandeep S. Sidhu, Sunil Khullar, Parvinder S. Sandhu, R. P. S. Bedi, Kiranbir Kaur

Abstract:

In this paper, a subtractive clustering based fuzzy inference system approach is used for early detection of faults in function-oriented software systems. The approach has been tested with real-time defect datasets of the NASA software projects PC1 and CM1. Both the code-based model and the joined model (a combination of requirement and code-based metrics) of the datasets are used for training and testing of the proposed approach. The performance of the models is recorded in terms of Accuracy, MAE and RMSE values. The performance of the proposed approach is better in the case of the joined model. As evidenced by the results obtained, it can be concluded that clustering and fuzzy logic together provide a simple yet powerful means for early detection of faults in function-oriented software systems.

Keywords: Subtractive clustering, fuzzy inference system, fault proneness.
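The clustering step that feeds the fuzzy inference system can be sketched with plain subtractive clustering (Chiu's method). The radii and the stopping fraction below are the usual textbook defaults, assumed here rather than taken from the paper.

```python
# Plain subtractive clustering: pick the highest-potential point as a centre,
# suppress potential near it, and repeat until the remaining potential is small.
import numpy as np

def subtractive_clustering(X, ra=0.5, rb=None, stop_fraction=0.15):
    rb = rb or 1.5 * ra
    alpha, beta = 4.0 / ra**2, 4.0 / rb**2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    potential = np.exp(-alpha * d2).sum(axis=1)           # density "potential" of each point
    centers = []
    first = potential.max()
    while True:
        i = int(np.argmax(potential))
        if potential[i] < stop_fraction * first:
            break
        centers.append(X[i])
        # Suppress potential near the new centre so the next centre lies elsewhere.
        potential -= potential[i] * np.exp(-beta * d2[i])
        potential = np.maximum(potential, 0.0)
    return np.array(centers)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.2, 0.05, (50, 2)),
                   rng.normal(0.8, 0.05, (50, 2))])        # two blobs in the unit square
    print(subtractive_clustering(X, ra=0.3))
```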

2540 Selection of the Optimum Cooling Scheme for Generators based on Electro-Thermal Analysis

Authors: Diako Azizi, Ahmad Gholami, Vahid Abbasi

Abstract:

Optimal selection of electrical insulation in electrical machinery ensures reliability during operation. From the insulation point of view, the stator is the most important part of an electrical machine. This fact reveals the need to inspect the machine's insulation together with the electro-thermal stresses. In the first step of the study, a part of the whole machine structure that covers its general characteristics is chosen; then, based on electromagnetic analysis (the finite element method), the machine operation is simulated. In the simulation results, the temperature distribution of the total structure is presented simultaneously using electro-thermal analysis. The results of the electro-thermal analysis can be used to design an optimal cooling system. In order to design, review and compare cooling systems, four winding structures in the stator slots are presented. The structures are compared to each other in terms of electrical and thermal distribution and the remaining life of the insulation, using finite element analysis. Following these steps, an optimization algorithm is presented for the selection of an appropriate structure.

Keywords: Electrical field, field distribution, insulation, winding, finite element method, electro thermal

2539 Fighter Aircraft Selection Using Fuzzy Preference Optimization Programming (POP)

Authors: C. Ardil

Abstract:

The Turkish Air Force needs to acquire a sixth-generation fighter aircraft in order to maintain its air superiority and dominance against its rivals under the risks posed by global geopolitical opportunities and threats. Accordingly, five evaluation criteria were determined to evaluate the sixth-generation fighter aircraft alternatives and to select the best one. Systematically, a new fuzzy preference optimization programming (POP) method is proposed to select the best sixth-generation fighter aircraft in an uncertain environment. The POP technique considers both quantitative and qualitative evaluation criteria. To demonstrate the applicability and effectiveness of the proposed approach, it is applied to a multiple criteria decision-making problem to evaluate and select sixth-generation fighter aircraft. The results of the fuzzy POP method are compared with the results of the fuzzy TOPSIS approach to validate it. According to the comparative analysis, the fuzzy POP and fuzzy TOPSIS methods give the same results. This demonstrates the applicability of the fuzzy POP technique to the sixth-generation fighter selection problem.

Keywords: Fighter aircraft selection, sixth-generation fighter aircraft, fuzzy decision process, multiple criteria decision making, preference optimization programming, POP, TOPSIS, Kizilelma, MIUS, fuzzy set theory
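For context, the benchmark method can be sketched in its crisp form: TOPSIS ranks alternatives by their relative closeness to the ideal solution. The paper uses fuzzy variants; the fuzzification step is omitted here, and the decision matrix, weights and criterion directions are hypothetical placeholders, not the study's evaluation data.

```python
# Crisp TOPSIS sketch: normalise, weight, and rank by closeness to the ideal solution.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] is True if larger is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)          # vector normalisation
    v = norm * weights                                      # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                          # closeness to the ideal solution

if __name__ == "__main__":
    alternatives = ["Aircraft A", "Aircraft B", "Aircraft C"]
    # Columns (all hypothetical): range, payload, stealth score, unit cost, maintenance cost.
    matrix = np.array([[1500., 8.0, 7.0, 90., 12.],
                       [1800., 6.5, 9.0, 120., 15.],
                       [1300., 9.0, 6.0, 70., 10.]])
    weights = np.array([0.25, 0.2, 0.25, 0.2, 0.1])
    benefit = np.array([True, True, True, False, False])
    for name, c in sorted(zip(alternatives, topsis(matrix, weights, benefit)),
                          key=lambda p: -p[1]):
        print(f"{name}: closeness = {c:.3f}")
```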

2538 Web Application for University Internship Program Management

Authors: Prasanth Sabarish Nair, Thomas Binu, Madiajagan Muthaiyan

Abstract:

This paper discusses a software application to aid the smooth functioning of a university internship program, comprising student, faculty and administration modules. The software can also compute the most suitable assignment of students to stations and allocate them accordingly.

Keywords: Academic evaluation, administration monitoring, automatic allocation system, internship, student preferences.
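One generic way to compute such an assignment of students to stations is to treat it as an assignment problem over a suitability matrix and solve it with the Hungarian algorithm. This is a sketch of that idea, not the application's actual allocation logic; the names and scores are made up.

```python
# Student-to-station assignment via the Hungarian algorithm (generic sketch).
import numpy as np
from scipy.optimize import linear_sum_assignment

students = ["Asha", "Binu", "Chen", "Dana"]
stations = ["Embedded Lab", "Web Team", "Data Group", "QA Cell"]

# suitability[i, j]: how well student i matches station j (higher is better).
suitability = np.array([[0.9, 0.4, 0.6, 0.2],
                        [0.3, 0.8, 0.5, 0.6],
                        [0.7, 0.5, 0.9, 0.4],
                        [0.2, 0.6, 0.3, 0.8]])

# linear_sum_assignment minimises cost, so negate the suitability scores.
rows, cols = linear_sum_assignment(-suitability)
for i, j in zip(rows, cols):
    print(f"{students[i]} -> {stations[j]} (score {suitability[i, j]:.1f})")
```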

2537 An Agent Based Dynamic Resource Scheduling Model with FCFS-Job Grouping Strategy in Grid Computing

Authors: Raksha Sharma, Vishnu Kant Soni, Manoj Kumar Mishra, Prachet Bhuyan, Utpal Chandra Dey

Abstract:

Grid computing is a group of clusters connected over high-speed networks that involves coordinating and sharing computational power, data storage and network resources operating across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing. Resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which best suits large-scale applications, and the root node of the MHT is selected for job submission. The job grouping concept is used to maximize resource utilization when scheduling jobs in grid computing. The proposed resource selection model and job grouping concept enhance the scalability, robustness, efficiency and load balancing ability of the grid.

Keywords: Agent, Grid Computing, Job Grouping, Max Heap Tree (MHT), Resource Scheduling.
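The two ideas named in the abstract can be sketched briefly: resources are kept in a max-heap keyed by capacity (the root is picked for submission), and jobs are grouped first-come-first-served until a group reaches the selected resource's capacity. Python's heapq is a min-heap, so capacities are negated; all numbers are illustrative, not from the paper.

```python
# Max-heap resource selection plus FCFS job grouping (illustrative sketch).
import heapq

def build_resource_heap(resources):
    heap = [(-capacity, name) for name, capacity in resources]   # max-heap via negation
    heapq.heapify(heap)
    return heap

def group_jobs_fcfs(jobs, capacity):
    groups, current, used = [], [], 0
    for job_id, length in jobs:                  # jobs are taken strictly in arrival order
        if used + length > capacity and current:
            groups.append(current)
            current, used = [], 0
        current.append(job_id)
        used += length
    if current:
        groups.append(current)
    return groups

if __name__ == "__main__":
    resources = [("R1", 40), ("R2", 100), ("R3", 70)]            # capacities (made up)
    jobs = [("J1", 20), ("J2", 35), ("J3", 50), ("J4", 10), ("J5", 60), ("J6", 25)]
    heap = build_resource_heap(resources)
    capacity, best = -heap[0][0], heap[0][1]                     # root of the max-heap
    print(f"submit to {best} (capacity {capacity})")
    for g in group_jobs_fcfs(jobs, capacity):
        print("job group:", g)
```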

2536 Manual Testing of Web Software Systems Supported by Direct Guidance of the Tester Based On Design Model

Authors: Karel Frajtak, Miroslav Bures, Ivan Jelinek

Abstract:

Software testing is an important stage of the software development cycle. The current testing process involves the tester and electronic documents with test case scenarios. In this paper we focus on a new approach to the testing process using automated test case generation and tester guidance through the system, based on a model of the system. Test case generation and model-based testing are not possible without a proper system model. We aim at providing better feedback from the testing process, thus eliminating unnecessary paper work.

Keywords: Model based testing, test automation, test generating, tester support.
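As a small illustration of deriving test cases from a system model, a web application can be represented as a page-navigation graph, with every simple path from the entry page to a goal page becoming a candidate scenario for the tester to follow. The page names and transitions below are made-up placeholders, not the paper's model.

```python
# Derive candidate test scenarios as simple paths through a page-navigation graph.
def all_paths(graph, node, goal, path=None):
    path = (path or []) + [node]
    if node == goal:
        return [path]
    paths = []
    for nxt in graph.get(node, []):
        if nxt not in path:                      # keep paths simple (no revisits)
            paths.extend(all_paths(graph, nxt, goal, path))
    return paths

navigation_model = {
    "login":     ["dashboard"],
    "dashboard": ["search", "profile"],
    "search":    ["detail", "dashboard"],
    "detail":    ["checkout"],
    "profile":   ["checkout", "dashboard"],
}

for i, scenario in enumerate(all_paths(navigation_model, "login", "checkout"), 1):
    print(f"test case {i}: " + " -> ".join(scenario))
```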

2535 Increasing Profitability Supported by Innovative Methods and Designing Monitoring Software in Condition-Based Maintenance: A Case Study

Authors: Nasrin Farajiparvar

Abstract:

In the present article, a new method has been developed to enhance the application of equipment monitoring, which in turn improves the economic impact of condition-based maintenance in an automobile parts manufacturing factory. This study also describes how effective software with a simple database can be utilized to achieve cost-effective improvements in maintenance performance. The most important results of this project are: (1) a 63% reduction in direct and indirect maintenance costs; (2) creation of a proper database to analyse failures; (3) creation of a method to control system performance and extend it to similar systems; and (4) design of software to analyse the database and consequently create the technical knowledge needed to face unusual system conditions. Moreover, the results of this study have shown that the concept and philosophy of maintenance has not been well understood in most Iranian industries; thus, more investment is strongly required to improve maintenance conditions.

Keywords: Condition-based maintenance, Economic savings, Iran industries, Machine life prediction software.
