Search results for: Subset Selection.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1091

701 A Flexible Flowshop Scheduling Problem with Machine Eligibility Constraint and Two Criteria Objective Function

Authors: Bita Tadayon, Nasser Salmasi

Abstract:

This research deals with a flexible flowshop scheduling problem in which jobs arrive and are delivered in groups but are processed individually. Due to the special characteristics of each job, only a subset of the machines in each stage is eligible to process that job. The objective function combines minimization of the sum of group completion times on one hand with minimization of the sum of the differences between each job's completion time and the delivery time of the group containing that job (its waiting period) on the other. The problem can be stated as FFc / rj, Mj / irreg, which has many applications in production and service industries. A mathematical model is proposed, the problem is proved to be NP-complete, and an effective heuristic method is presented to schedule the jobs efficiently. This algorithm can then be used within the body of any metaheuristic algorithm for solving the problem.
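A minimal sketch of how the two-criteria objective could be evaluated for a given schedule is shown below. The weighting parameters alpha and beta, and the assumption that a group is delivered when its last job completes, are ours for illustration, not the authors' formulation.

```python
# Hypothetical sketch: evaluating the two-criteria objective for a given schedule.
# Completion times and group membership are assumed inputs; a group is assumed
# to be delivered when its last job finishes.

def two_criteria_objective(job_completion, job_group, alpha=1.0, beta=1.0):
    """Weighted sum of group completion times and per-job waiting periods."""
    # A group's completion (and delivery) time is the completion time of its latest job.
    group_completion = {}
    for job, c in job_completion.items():
        g = job_group[job]
        group_completion[g] = max(group_completion.get(g, 0.0), c)

    sum_group_completion = sum(group_completion.values())
    # Waiting period of a job: delivery time of its group minus its own completion time.
    sum_waiting = sum(group_completion[job_group[j]] - c
                      for j, c in job_completion.items())
    return alpha * sum_group_completion + beta * sum_waiting

# Example: two groups, three jobs.
completion = {"j1": 5.0, "j2": 8.0, "j3": 6.0}
groups = {"j1": "g1", "j2": "g1", "j3": "g2"}
print(two_criteria_objective(completion, groups))
```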

Keywords: flexible flowshop scheduling, group processing, machine eligibility constraint, mathematical modeling.

700 Investigating the Effective Parameters in Determining the Type of Traffic Congestion Pricing Schemes in Urban Streets

Authors: Saeed Sayyad Hagh Shomar

Abstract:

Traffic congestion pricing – a travel demand management strategy used in urban areas to reduce traffic congestion, air pollution and noise pollution – has attracted much attention. Despite the satisfactory findings of this method, there are still problems in determining the best functional congestion pricing scheme for a given situation, and such problems can result in further complications and even in failure of the scheme. That is why proper knowledge of the significance of congestion pricing schemes and of the factors that govern their selection can lead to the success of this strategy. In this study, first, a variety of traffic congestion pricing schemes and their components are introduced; then, their functional usage is discussed. Next, by analyzing and comparing the barriers, limitations and advantages, the selection criteria for pricing schemes are described. The results show that the selection of the best scheme depends on various parameters. Finally, based on examining the effective parameters, it is concluded that the implementation of area-based schemes (cordon and zonal) has been more successful in avoiding traffic diversion. That is, considering the topology of cities and the fact that traffic congestion is often created in city centers, area-based schemes are notably functional and appropriate.

Keywords: Congestion pricing, demand management, flat toll, variable toll.

699 Wheat Yield Prediction through Agro Meteorological Indices for Ardebil District

Authors: Fariba Esfandiary, Ghafoor Aghaie, Ali Dolati Mehr

Abstract:

Wheat yield prediction was carried out using different meteorological variables together with agro-meteorological indices in Ardebil district for the years 2004–2005 and 2005–2006. On the basis of correlation coefficients, the standard error of estimate, and the relative deviation of predicted yield from actual yield under different statistical models, the best subset of agro-meteorological indices was selected, including daily minimum temperature (Tmin), accumulated difference of maximum and minimum temperatures (TD), growing degree days (GDD), accumulated water vapor pressure deficit (VPD), sunshine hours (SH), and potential evapotranspiration (PET). Yield prediction was done two months in advance of harvesting time, which coincided with the commencement of the reproductive stage of wheat (5th of June). In the final statistical models, 83% of wheat yield variability was accounted for by variation in the above agro-meteorological indices.
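A minimal sketch of this kind of index-based yield regression is given below. The numbers are invented and the model is a plain least-squares fit; the paper's actual data, model form, and coefficients are not reproduced.

```python
# Illustrative sketch only: fitting a linear yield model on the listed indices.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: Tmin, TD, GDD, VPD, SH, PET (hypothetical values for a few seasons)
X = np.array([
    [4.1, 310.0, 1450.0, 95.0, 820.0, 410.0],
    [3.6, 295.0, 1390.0, 88.0, 790.0, 395.0],
    [4.8, 330.0, 1510.0, 102.0, 860.0, 430.0],
    [3.9, 305.0, 1420.0, 91.0, 805.0, 400.0],
])
y = np.array([2.9, 2.6, 3.2, 2.8])  # yield in t/ha (hypothetical)

model = LinearRegression().fit(X, y)
print("R^2 on training data:", model.score(X, y))
print("Predicted yield for a new season:",
      model.predict([[4.0, 300.0, 1400.0, 90.0, 800.0, 405.0]]))
```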

Keywords: Wheat yield prediction, agro-meteorological indices, statistical models

698 Design of Liquids Mixing Control System using Fuzzy Time Control Discrete Event Model for Industrial Applications

Authors: M. Saleem Khan, Khaled Benkrid

Abstract:

This paper presents a time-controlled liquid mixing system for tanks as an application of a fuzzy time control discrete event model. The system is designed for a wide range of industrial applications. The simulated control system has three inputs: volume, viscosity, and selection of product, along with three external control adjustments used to calibrate the system or to take over its control autonomously in a local or distributed environment. There are four controlling elements: a rotary motor, a grinding motor, heating and cooling units, and valve selection, each with a time frame limit. The system measures three controlled variables through its sensing mechanism for feedback control. This design also enables the liquid mixing system to grind certain materials in the tanks and mix them with fluids under a temperature-controlled environment to achieve a required viscosity level. The design of the fuzzifier, inference engine, rule base, defuzzifier, and discrete event control system is discussed. Time control fuzzy rules are formulated, applied, and tested using MATLAB simulation of the system.

Keywords: Fuzzy time control, industrial application and time control systems, adjustment of Fuzzy system, liquids mixing system, design of fuzzy time control DEV system.

697 Heterogeneous Attribute Reduction in Noisy System based on a Generalized Neighborhood Rough Sets Model

Authors: Siyuan Jing, Kun She

Abstract:

Neighborhood Rough Sets (NRS) have been proven to be an efficient tool for heterogeneous attribute reduction. However, most research has focused on complete and noiseless data, whereas most real information systems are noisy, i.e., filled with incomplete and inconsistent data. In this paper, we introduce a generalized neighborhood rough sets model, called VPTNRS, to deal with the problem of heterogeneous attribute reduction in noisy systems. We generalize the classical NRS model with a tolerance neighborhood relation and probabilistic theory. Furthermore, we use the neighborhood dependency to evaluate the significance of a subset of heterogeneous attributes and construct a forward greedy algorithm for attribute reduction based on it. Experimental results show that the model deals with noisy data efficiently.
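The sketch below illustrates the general shape of a dependency-driven forward greedy reduction. It uses a plain neighborhood dependency (the fraction of samples whose neighborhood is label-pure); the paper's VPTNRS variant with its tolerance relation and probabilistic thresholds is not reproduced, and the data and delta value are placeholders.

```python
# Schematic sketch of forward greedy attribute reduction driven by a dependency measure.
import numpy as np

def neighborhood_dependency(X, y, attrs, delta=0.2):
    """Fraction of samples whose delta-neighborhood (on attrs) has a single label."""
    Xs = X[:, attrs]
    consistent = 0
    for i in range(len(y)):
        dist = np.abs(Xs - Xs[i]).max(axis=1)   # Chebyshev distance on selected attrs
        neighbors = dist <= delta
        if len(set(y[neighbors])) == 1:
            consistent += 1
    return consistent / len(y)

def forward_greedy_reduct(X, y, delta=0.2, eps=1e-6):
    remaining = list(range(X.shape[1]))
    reduct, best = [], 0.0
    while remaining:
        gains = [(neighborhood_dependency(X, y, reduct + [a], delta), a) for a in remaining]
        gamma, a = max(gains)
        if gamma - best <= eps:                  # no significant gain: stop
            break
        reduct.append(a)
        remaining.remove(a)
        best = gamma
    return reduct

# Tiny example with normalized numeric attributes.
X = np.array([[0.1, 0.9, 0.5], [0.2, 0.8, 0.5], [0.9, 0.1, 0.5], [0.8, 0.2, 0.5]])
y = np.array([0, 0, 1, 1])
print(forward_greedy_reduct(X, y))
```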

Keywords: attribute reduction, incomplete data, inconsistent data, tolerance neighborhood relation, rough sets

696 Secure Secret Recovery by using Weighted Personal Entropy

Authors: Leau Y. B., Dinna Nina M. N., Habeeb S. A. H., Jetol B.

Abstract:

Authentication plays a vital role in many secure systems. Most of these systems require the user to log in with a secret password or passphrase before entering the system. This is to ensure that all valuable information is kept confidential while also guaranteeing its integrity and availability. To achieve this goal, however, users are required to memorize high-entropy passwords or passphrases, and this sometimes makes it difficult for users to remember meaningless strings of data. This paper presents a new scheme which assigns a weight to each personal question given to the user when revealing the encrypted secrets or password. The focus of this scheme is to offer fault tolerance by allowing users to forget the answers to a subset of questions and still recover the secret and achieve successful authentication. A comparison of the level of security of weight-based and weightless secret recovery schemes is also discussed. The paper concludes with the few areas that require more investigation in this research.

Keywords: Secret Recovery, Personal Entropy, Cryptography, Secret Sharing and Key Management.

695 Attribute Based Comparison and Selection of Modular Self-Reconfigurable Robot Using Multiple Attribute Decision Making Approach

Authors: Manpreet Singh, V. P. Agrawal, Gurmanjot Singh Bhatti

Abstract:

Over the last decades there has been significant technological advancement in the field of robotics, and a number of modular self-reconfigurable robots have been introduced that can help in space exploration, the "bucket of stuff" application, search and rescue operations during earthquakes, etc. As there are numerous self-reconfigurable robots, choosing the optimum one is always a concern for the robot user, given the growing range of available features, facilities, complexity, etc. The objective of this research work is to present a multiple attribute decision making based methodology for coding, evaluation, comparison, ranking and selection of modular self-reconfigurable robots using the technique for order preference by similarity to ideal solution (TOPSIS). In total, 86 attributes that affect structure and performance are identified, and a database of modular self-reconfigurable robots on the basis of the different pertinent attributes is generated; this database is very useful for users selecting a robot that suits their operational needs. Two visual methods, namely the linear graph and the spider chart, are proposed for ranking modular self-reconfigurable robots. Using five robots (Atron, Smores, Polybot, M-Tran 3, Superbot), an example is illustrated and ranking of the robots is successfully carried out, which shows that Smores is the best robot for the operational need illustrated and that this methodology is effective and simple to use.
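A minimal TOPSIS sketch is given below to make the ranking step concrete. The decision matrix, weights, and criterion directions are invented placeholders, not the paper's 86-attribute data set, so the resulting order will not match the paper's.

```python
# Minimal TOPSIS sketch for ranking alternatives (robots) on a few criteria.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j]=True if larger is better."""
    M = matrix / np.sqrt((matrix ** 2).sum(axis=0))          # vector normalization
    V = M * weights                                           # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))   # positive-ideal solution
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))   # negative-ideal solution
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)                            # closeness coefficient

robots = ["Atron", "Smores", "Polybot", "M-Tran 3", "Superbot"]
matrix = np.array([[7, 5, 6], [9, 8, 7], [6, 7, 5], [8, 6, 6], [7, 7, 8]], float)
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, False])    # e.g. the last criterion could be cost-like
scores = topsis(matrix, weights, benefit)
for name, s in sorted(zip(robots, scores), key=lambda p: -p[1]):
    print(f"{name}: {s:.3f}")
```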

Keywords: Self-reconfigurable robots, MADM, TOPSIS, morphogenesis, scalability.

694 The Optimal Equilibrium Capacity of Information Hiding Based on Game Theory

Authors: Ziquan Hu, Kun She, Shahzad Ali, Kai Yan

Abstract:

Game theory can be used to analyze conflict issues in the field of information hiding. In this paper, a two-phase game is used to build an embedder-attacker system to analyze the limits of the hiding capacity of embedding algorithms: the embedder minimizes the expected damage while the attacker maximizes it. In this system, the embedder first consumes its resource to build embedded units (EU) and insert the secret information into them; the attacker then distributes its resource evenly over the attacked EU. The expected equilibrium damage, which is the maximum damage from the attacker's point of view and the minimum damage the embedder can achieve against the attacker, is evaluated for the case when the attacker attacks a subset of all the EU. Furthermore, the optimal equilibrium capacity of hiding information is calculated through the optimal number of EU carrying embedded secret information. Finally, illustrative examples of the optimal equilibrium capacity are presented.

Keywords: 2-Phase Game, Expected Equilibrium Damage, Information Hiding, Optimal Equilibrium Capacity.

693 Modeling the Country Selection Decision in Retail Internationalization

Authors: A. Hortacsu, A. Tektas

Abstract:

This paper aims to develop a model that assists the international retailer in selecting the country that maximizes the degree of fit between the retailer's goals and the country's characteristics in the retailer's initial internationalization move. A two-stage multi-criteria decision model is designed, integrating the Analytic Hierarchy Process (AHP) and Goal Programming. Ethical, cultural, geographic and economic proximity are identified as the relevant constructs of the internationalization decision, and these constructs are further structured into sub-factors within the analytic hierarchy. The model helps the retailer to integrate, rank and weigh a number of hard and soft factors and to prioritize the countries accordingly. The model has been implemented for a Turkish luxury goods retailer who was planning to internationalize, and the actual entry of this retailer into the selected country supports the model. Implementation on a single retailer limits the generalizability of the results; however, the emphasis of the paper is on construct identification and model development. The paper enriches the existing literature by proposing a hybrid multi-objective decision model which introduces new soft dimensions, i.e. perceived distance, ethical proximity and humane orientation, to the decision process and facilitates effective decision making.
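The sketch below illustrates only the AHP step: deriving criterion weights from a pairwise comparison matrix via its principal eigenvector. The comparison values and the four criterion labels are placeholders, and the goal-programming stage of the paper's two-stage model is omitted.

```python
# Sketch of AHP priority weights from a reciprocal pairwise comparison matrix.
import numpy as np

criteria = ["ethical proximity", "cultural proximity", "geographic proximity", "economic proximity"]

# Saaty-scale pairwise comparison matrix (hypothetical judgments, reciprocal).
A = np.array([
    [1,   3,   5,   2],
    [1/3, 1,   3,   1/2],
    [1/5, 1/3, 1,   1/4],
    [1/2, 2,   4,   1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                       # normalized priority weights

lam_max = eigvals.real[k]
n = A.shape[0]
ci = (lam_max - n) / (n - 1)       # consistency index of the judgments
print(dict(zip(criteria, np.round(w, 3))), "CI =", round(ci, 3))
```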

Keywords: Analytic hierarchy process, culture, ethics, goal programming, retail foreign market selection.

692 Evaluation of Internal Ballistics of Multi-Perforated Grain in a Closed Vessel

Authors: B. A. Parate, C. P. Shetty

Abstract:

This research article describes the evaluation methodology for the internal ballistics of a multi-perforated grain in a closed vessel (CV). Propellant testing in a CV is conducted to characterize propellants and to ascertain the various internal ballistic parameters. The assessment of internal ballistics plays a crucial role in judging the suitability of a propellant for a given application, and propellants used in the defense sector have to satisfy the user requirements laid down in specifications. The outputs from the CV evaluation of the multi-perforated grain are a maximum pressure of 226.75 MPa, a rate of pressure rise (dP/dt) of 36.99 MPa/ms, an average vivacity of 9.990×10⁻⁴/(MPa·ms), a force constant of 933.9 J/g, a rise time of 9.85 ms, and a pressure index of 0.878 with a burning coefficient of 0.2919. This paper addresses the internal ballistics of a multi-perforated grain, propellant selection, the related calculations, and the evaluation of the various parameters in CV testing. For the current analysis, the propellant is evaluated in a 100 cc CV with a propellant mass of 20 g, giving a loading density of 0.2 g/cc. The method for determining the internal ballistic properties consists of burning the propellant mass under constant volume.
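A sketch of how quantities such as dP/dt and vivacity could be extracted from a pressure-time record is shown below. The synthetic trace, the evaluation window, and the vivacity convention A = (dP/dt)/(P·Pmax) are assumptions standing in for the authors' actual data reduction.

```python
# Sketch of closed-vessel data reduction on a synthetic pressure-time trace.
import numpy as np

t = np.linspace(0.0, 15.0, 1501)                      # time, ms
P = 226.75 / (1.0 + np.exp(-(t - 8.0) / 1.2))         # synthetic pressure rise, MPa

dPdt = np.gradient(P, t)                              # MPa/ms
Pmax = P.max()
print("Pmax [MPa]:", round(Pmax, 2))
print("max dP/dt [MPa/ms]:", round(dPdt.max(), 2))

# Average vivacity over a 20%-80% of Pmax window (an assumed evaluation range).
mask = (P > 0.2 * Pmax) & (P < 0.8 * Pmax)
vivacity = dPdt[mask] / (P[mask] * Pmax)              # 1/(MPa*ms)
print("average vivacity [1/(MPa*ms)]:", vivacity.mean())
```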

Keywords: Burning rate, closed vessel, force constant, internal ballistic, loading density, maximum pressure, multi-propellant grain, propellant, rise time, vivacity.

691 Coverage and Connectivity Problem in Sensor Networks

Authors: Meenakshi Bansal, Iqbal Singh, Parvinder S. Sandhu

Abstract:

In over-deployed sensor networks, one approach to conserving energy is to keep only a small subset of sensors active at any instant. For coverage problems, the monitored area is represented by a set of points that require sensing, called demand points, and the coverage area of a node is taken to be a circle of radius R, where R is the sensing range; if the distance between a demand point and a sensor node is less than R, the node is able to cover that point. We consider a wireless sensor network consisting of a set of sensors deployed randomly. A point in the monitored area is covered if it is within the sensing range of a sensor. In some applications, when the network is sufficiently dense, area coverage can be approximated by guaranteeing point coverage; in this case, the locations of the wireless devices can be used to represent the whole area, and the working sensors are required to cover all the other sensors. We also introduce a hybrid algorithm and discuss challenges related to coverage in sensor networks.
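The point-coverage test described above reduces to a simple distance check, sketched below with made-up sensor and demand-point coordinates.

```python
# Simple sketch of point coverage: a demand point is covered if it lies within
# sensing range R of at least one active sensor.
import math

def is_covered(point, sensors, R):
    return any(math.dist(point, s) <= R for s in sensors)

def coverage_ratio(demand_points, sensors, R):
    covered = sum(is_covered(p, sensors, R) for p in demand_points)
    return covered / len(demand_points)

sensors = [(2.0, 3.0), (7.0, 8.0), (5.0, 1.0)]
demand_points = [(1.0, 2.0), (6.5, 7.0), (9.0, 9.0), (4.0, 2.0)]
print(coverage_ratio(demand_points, sensors, R=2.5))
```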

Keywords: Wireless sensor networks, network coverage, Energy conservation, Hybrid Algorithms.

690 Fuzzy Uncertainty Theory for Stealth Fighter Aircraft Selection in Entropic Fuzzy TOPSIS Decision Analysis Process

Authors: C. Ardil

Abstract:

The purpose of this paper is to present fuzzy TOPSIS in an entropic fuzzy environment. Due to the ambiguous concepts often represented in decision data, exact values are insufficient to model real-life situations. In this paper, the rating of each alternative is defined in fuzzy linguistic terms, which can be expressed with triangular fuzzy numbers. The weight of each criterion is then derived from the decision matrix using the entropy weighting method. Next, a vertex method is proposed to calculate the distance between two triangular fuzzy numbers. According to the TOPSIS concept, a closeness coefficient is defined to determine the ranking order of all alternatives by simultaneously calculating the distances to both the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS). Finally, an illustrative example of selecting stealth fighter aircraft is shown at the end of this article to highlight the procedure of the proposed method. Correlation analysis and validation analysis using TOPSIS, WSM, and WPM methods were performed to compare the ranking order of the alternatives.
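Two of the building blocks mentioned above, the vertex distance between triangular fuzzy numbers and the closeness coefficient, are sketched below. The rating and the FPIS/FNIS values are placeholders for illustration.

```python
# Sketch: vertex distance between triangular fuzzy numbers and closeness coefficient.
import math

def vertex_distance(a, b):
    """a, b: triangular fuzzy numbers given as (l, m, u)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3.0)

def closeness(d_pos, d_neg):
    """Closeness coefficient from distances to FPIS and FNIS."""
    return d_neg / (d_pos + d_neg)

rating = (5.0, 7.0, 9.0)      # a fuzzy linguistic rating such as "good"
fpis   = (9.0, 9.0, 9.0)      # fuzzy positive-ideal value for this criterion
fnis   = (1.0, 1.0, 1.0)      # fuzzy negative-ideal value for this criterion

d_pos = vertex_distance(rating, fpis)
d_neg = vertex_distance(rating, fnis)
print("d+ =", round(d_pos, 3), "d- =", round(d_neg, 3),
      "CC =", round(closeness(d_pos, d_neg), 3))
```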

Keywords: stealth fighter aircraft selection, fuzzy uncertainty theory (FUT), fuzzy entropic decision (FED), fuzzy linguistic variables, triangular fuzzy numbers, multiple criteria decision making analysis, MCDMA, TOPSIS, WSM, WPM

689 Optimal Portfolio Selection in a DC Pension with Multiple Contributors and the Impact of Stochastic Additional Voluntary Contribution on the Optimal Investment Strategy

Authors: Edikan E. Akpanibah, Okwigbedi Oghen’Oro

Abstract:

In this paper, we study the optimal portfolio selection in a defined contribution (DC) pension scheme with multiple contributors under the constant elasticity of variance (CEV) model and the impact of stochastic additional voluntary contributions on the investment strategies. We assume that the voluntary contributions are stochastic and consider investments in a risk-free asset and a risky asset to increase the expected returns of the contributing members. We derive a stochastic differential equation which consists of the members' monthly contributions and the invested fund, and obtain an optimization problem with the help of the Hamilton-Jacobi-Bellman equation. Furthermore, we find an explicit solution for the optimal investment strategy with stochastic voluntary contributions using the power transformation and change of variables method, and the corresponding optimal fund size is obtained. We discuss the impact of the voluntary contribution on the optimal investment strategy with numerical simulations and observe that the voluntary contribution reduces the optimal investment in the risky asset.

Keywords: DC pension fund, Hamilton-Jacobi-Bellman, optimal investment strategies, power transformation method, stochastic, voluntary contribution.

688 Practical Evaluation of High-Efficiency Si-Based Tandem Solar Cells

Authors: Sue-Yi Chen, Wei-Chun Hsu, Jon-Yiew Gan

Abstract:

Si-based double-junction tandem solar cells have become a popular research topic because of their low manufacturing cost and high energy conversion efficiency. However, there is no established set of calculations for selecting appropriate top-cell materials. This paper therefore proposes a simple but practical selection method. First, we calculate the Shockley-Queisser (S-Q) limit and explain the reasons for developing tandem solar cells. Secondly, we calculate the theoretical energy conversion efficiency of double-junction tandem solar cells, combining commercial monocrystalline Si with the practical efficiencies of candidate top-cell materials to reflect the actual situation. Finally, we conservatively conclude that if 75% of the theoretical energy conversion efficiency of the top cell is assumed, the suitable bandgap energy range falls between 1.38 eV and 2.5 eV. We also briefly describe possible improvements to several suitable materials (CZTS, CdSe, Cu2O, ZnTe, and CdS), hoping that future research can build on this paper to select and manufacture high-efficiency Si-based tandem solar cells. Most importantly, our calculation method is not limited to silicon: if the performance of other materials matches or surpasses that of silicon in the future, researchers can apply the same deduction process.

Keywords: High-efficiency solar cells, material selection, Si-based double-junction solar cells, tandem solar cells, photovoltaics.

687 Expression of Leucaena Leucocephala de Wit Chitinase in Transgenic Koshihikari Rice

Authors: M. Kaomek, J. R. Ketudat-Cairns

Abstract:

The cDNA encoding the 326 amino acids of a Class I basic chitinase gene from Leucaena leucocephala de Wit (KB3, GenBank accession AAM49597) was cloned under the control of the CaMV35S promoter in pCAMBIA 1300 and transferred to Koshihikari rice. Calli of Koshihikari rice were transformed with Agrobacterium carrying this construct, expressing the chitinase and β-glucuronidase (GUS). A callus induction frequency of 90% was obtained from rice seedlings cultured on NB medium, and a high regeneration frequency of 74% was obtained from calli cultured on regeneration medium containing 4 mg/l BAP and 7 g/l phytagel at 25°C. Various factors were studied in order to establish a procedure for the transformation of Koshihikari by Agrobacterium tumefaciens. Supplementation of the medium with 50 mM acetosyringone during co-cultivation was important to enhance the frequency of transient transformation, and 4-week-old scutellum-derived calli were excellent starting materials. A selection medium based on NB medium supplemented with 40 mg/l hygromycin and 400 mg/l cefotaxime was optimal for the selection of transformed rice calli, and a transformation rate of 70% was obtained. Recombinant calli and regenerated rice plants were checked for the expression of chitinase and gus by PCR, Northern blot, Southern blot, and GUS assay. Chitinase and gus were expressed in all parts of the recombinant rice. The rice line expressing the KB3 chitinase was more resistant to the fungus Fusarium moniliforme than the control line.

Keywords: chitinase, Leucaena leucocephala de Wit, Koshihikari, transgenic rice.

686 The Feedback Control for Distributed Systems

Authors: Kamil Aida-zade, C. Ardil

Abstract:

We study the problem of synthesizing lumped-source controls for objects with distributed parameters on the basis of continuous observation of the phase state at given points of the object. In the proposed approach the phase state space (phase space) is partitioned beforehand at the observable points into given subsets (zones). The synthesized control actions are taken from the class of piecewise constant functions, and their current values are determined by the subset of the phase space that contains the aggregate of the current states of the object at the observable points (in these states the control actions take constant values). In this paper such synthesized control actions are called zone control actions. A technique to obtain optimal values of the zone control actions using smooth optimization methods is given; to this end, formulas for the gradient of the objective functional in the space of zone control actions are derived.

Keywords: Feedback control, distributed systems, smooth optimization methods, lumped control synthesis.

685 Transform-Domain Rate-Distortion Optimization Accelerator for H.264/AVC Video Encoding

Authors: Mohammed Golam Sarwer, Lai Man Po, Kai Guo, Q.M. Jonathan Wu

Abstract:

In H.264/AVC video encoding, rate-distortion optimization for mode selection plays a significant role in achieving outstanding compression efficiency and video quality. However, this mode selection process also makes encoding extremely complex, especially the computation of the rate-distortion cost function, which includes computing the sum of squared differences (SSD) between the original and reconstructed image blocks and context-based entropy coding of the block. In this paper, a transform-domain rate-distortion optimization accelerator based on a fast SSD (FSSD) and a VLC-based rate estimation algorithm is proposed. This algorithm significantly simplifies the hardware architecture for the rate-distortion cost computation with only negligible performance degradation. An efficient hardware structure for implementing the proposed accelerator is also presented. Simulation results demonstrate that the proposed algorithm reduces total encoding time by about 47% with negligible degradation of coding performance. The proposed method can easily be applied to many mobile video applications such as digital cameras and DMB (Digital Multimedia Broadcasting) phones.
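The rate-distortion cost that mode selection minimizes has the familiar form J = SSD + λ·R, sketched below. The block values, candidate modes, rates, and Lagrange multiplier are invented; the paper's fast SSD and VLC-based rate estimation are not reproduced here.

```python
# Sketch of rate-distortion cost based mode selection, J = SSD + lambda * R.
import numpy as np

def ssd(original, reconstructed):
    diff = original.astype(np.int64) - reconstructed.astype(np.int64)
    return int((diff ** 2).sum())

def rd_cost(original, reconstructed, rate_bits, lam):
    return ssd(original, reconstructed) + lam * rate_bits

rng = np.random.default_rng(0)
orig = rng.integers(0, 256, (4, 4), dtype=np.uint8)

# Two candidate modes: their reconstructions and estimated rates (hypothetical).
rec_intra = np.clip(orig.astype(np.int64) + rng.integers(-3, 4, (4, 4)), 0, 255)
rec_skip  = np.clip(orig.astype(np.int64) + rng.integers(-8, 9, (4, 4)), 0, 255)
candidates = {"intra": (rec_intra, 96), "skip": (rec_skip, 8)}

lam = 40.0   # Lagrange multiplier (depends on QP in a real encoder)
best = min(candidates, key=lambda m: rd_cost(orig, candidates[m][0], candidates[m][1], lam))
print("selected mode:", best)
```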

Keywords: Context-adaptive variable length coding (CAVLC), H.264/AVC, rate-distortion optimization (RDO), sum of squared difference (SSD).

684 Different in Factors of the Distributor Selection for Food and Non-Food OTOP Entrepreneur in Thailand

Authors: Phutthiwat Waiyawuththanapoom

Abstract:

This study has a single objective: to identify the differences in the factors used to choose a distributor between food and non-food OTOP entrepreneurs in Thailand. In this research, OTOP products are divided into two groups, food and non-food. The sample for the food type was processed fruit and vegetable products from Nakorn Pathom province, and the sample for the non-food type was the court doll from Ang Thong province. The research was divided into three parts: a study of the distribution pattern and distributor selection for the food type OTOP product, a study of the distribution pattern and distributor selection for the non-food type OTOP product, and a comparison between the two types of products to find the differences in the factors for choosing a distributor. The data and information were collected by interview, and the populations in the research were five producers of processed fruit and vegetable products from Nakorn Pathom province and five producers of the court doll from Ang Thong province. The significant factors in choosing the distributor of the food type OTOP product are material handling efficiency and on-time delivery, whereas for the non-food type OTOP product the focus is on the channel of distribution and the cost of the distributor.

Keywords: Distributor, OTOP, Food and Non-Food, Selection.

683 Image Retrieval Based on Multi-Feature Fusion for Heterogeneous Image Databases

Authors: N. W. U. D. Chathurani, Shlomo Geva, Vinod Chandran, Proboda Rajapaksha

Abstract:

Selecting an appropriate image representation is the most important factor in implementing an effective Content-Based Image Retrieval (CBIR) system. This paper presents a multi-feature fusion approach for efficient CBIR based on the distance distribution of features and relative feature weights computed at query time. It is a simple yet effective approach which is unaffected by feature dimensions, ranges, internal feature normalization and the choice of distance measure, and it can easily be adopted with any feature combination to improve retrieval quality. The proposed approach is empirically evaluated using two benchmark image classification datasets (a subset of the Corel dataset, and Oliva and Torralba) and compared with existing approaches. Its effectiveness is confirmed by significantly improved performance in comparison with the independently evaluated baselines of previously proposed feature fusion approaches.

Keywords: Feature fusion, image retrieval, membership function, normalization.

682 Bayesian Belief Networks for Test Driven Development

Authors: Vijayalakshmy Periaswamy S., Kevin McDaid

Abstract:

Testing accounts for the major share of technical effort in the software development process, typically consuming more than 50 percent of the total cost of developing a piece of software. The selection of software tests is a very important activity within this process to ensure that software reliability requirements are met. Generally, tests are run to achieve maximum coverage of the software code, and very little attention is given to the achieved reliability of the software. Using an existing methodology, this paper describes how to use Bayesian Belief Networks (BBNs) to select unit tests based on their contribution to the reliability of the module under consideration. In particular, the work examines how the approach can enhance test-first development by assessing the quality of test suites resulting from this development methodology and providing insight into additional tests that can significantly improve the achieved reliability. In this way the method can produce an optimal selection of inputs and an order in which the tests are executed to maximize the software reliability. To illustrate this approach, a belief network is constructed for a modern software system, incorporating expert opinion, expressed through probabilities, of the relative quality of the elements of the software and of the potential effectiveness of the software tests. The steps involved in constructing the Bayesian network are explained, as is a method to account for the test suite resulting from test-driven development.

Keywords: Software testing, Test Driven Development, Bayesian Belief Networks.

681 Statistical Measures and Optimization Algorithms for Gene Selection in Lung and Ovarian Tumor

Authors: C. Gunavathi, K. Premalatha

Abstract:

Microarray technology is widely used in the study of disease diagnosis using gene expression levels. The main difficulty with gene expression data is that it includes thousands of genes but only a small number of samples. Many methods and techniques have been proposed for tumor classification using microarray gene expression data. Feature or gene selection methods can be used to mine the genes that are directly involved in the classification and to eliminate irrelevant genes. In this paper, statistical measures such as the T-statistic, Signal-to-Noise Ratio (SNR) and F-statistic are used to rank the genes, and the ranked genes are used for further classification. The Particle Swarm Optimization (PSO) algorithm and the Shuffled Frog Leaping (SFL) algorithm are used to find the significant genes from the top-m ranked genes, and the Naïve Bayes Classifier (NBC) is used to classify the samples based on the significant genes. The proposed work is applied to lung and ovarian tumor datasets. The experimental results show that the proposed method achieves 100% accuracy on all three datasets, and the results are compared with previous works.
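The signal-to-noise-ratio ranking step is sketched below for a two-class problem. The toy expression matrix is invented, and the PSO/SFL search and the Naïve Bayes classifier that follow in the paper are not shown.

```python
# Sketch of SNR-based gene ranking for a two-class microarray problem.
import numpy as np

def snr_scores(X, y):
    """X: samples x genes, y: binary labels. Returns one SNR score per gene."""
    X0, X1 = X[y == 0], X[y == 1]
    return np.abs(X0.mean(axis=0) - X1.mean(axis=0)) / (X0.std(axis=0) + X1.std(axis=0) + 1e-12)

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 100))           # 20 samples, 100 genes
y = np.array([0] * 10 + [1] * 10)
X[y == 1, 5] += 2.0                      # make gene 5 differentially expressed

scores = snr_scores(X, y)
top_m = np.argsort(scores)[::-1][:10]    # indices of the top-m ranked genes
print("top genes:", top_m)
```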

Keywords: Microarray, T-Statistics, Signal-to-Noise Ratio, F-Statistics, Particle Swarm Optimization, Shuffled Frog Leaping, Naïve Bayes Classifier.

680 Economical and Technical Analysis of Urban Transit System Selection Using TOPSIS Method According to Constructional and Operational Aspects

Authors: Ali Abdi Kordani, Meysam Rooyintan, Sid Mohammad Boroomandrad

Abstract:

Nowadays, one of the most important problems in megacities is public transportation and satisfying citizens with this system in order to decrease traffic congestion and air pollution. Accordingly, to increase transit ridership and improve travel safety, new transportation systems such as Bus Rapid Transit (BRT), tram, and monorail have expanded, each with different merits and demerits. That is why comparing different systems for a systematic selection of public transportation systems is essential in a big city like Tehran, which has numerous traffic and pollution problems. This paper investigates the advantages and feasibility of the monorail, tram and BRT systems, which are widely used in megacities all over the world. For Tehran, using the SPSS statistical analysis software and the TOPSIS method, these three modes are compared to each other and the results are assessed. Experts experienced in the transportation field answered the prepared matrix questionnaire used to rank the public transportation modes (tram, monorail, and BRT). The results, based on the experts' judgments, show that monorail has the first priority, tram the second, and BRT the third according to the considered indices, namely execution costs, wasted time, depreciation, pollution, operation costs, travel time, passenger satisfaction, benefit-to-cost ratio and traffic congestion.

Keywords: Bus Rapid Transit, Costs, Monorail, Pollution, Tram.

679 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach

Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh

Abstract:

This study presents a hybrid pre-processing approach along with a conceptual modeling strategy to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition (EEMD) algorithm, the Discrete Wavelet Transform (DWT) and Mutual Information (MI) are employed as a hybrid pre-processing approach coupled to a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely a multi-station model, is developed to forecast the Souris River discharge more accurately; this strategy is capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD are coupled, and feature selection is performed on the decomposed sub-series using MI before they are employed in the multi-station model. In the proposed feature selection method, some useless sub-series are omitted to achieve better performance. The results confirm the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
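The mutual-information screening step is sketched below on synthetic sub-series (sinusoids plus noise standing in for DWT/EEMD components). The threshold value is arbitrary, and the LSSVM forecasting stage is not shown.

```python
# Sketch of MI-based screening of candidate sub-series against the target series.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 500
t = np.arange(n)
target = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=n)     # "discharge"

subseries = np.column_stack([
    np.sin(2 * np.pi * t / 50),           # informative component
    np.sin(2 * np.pi * t / 7),            # mostly irrelevant component
    rng.normal(size=n),                   # pure noise component
])

mi = mutual_info_regression(subseries, target, random_state=0)
keep = np.where(mi > 0.05)[0]             # drop "useless" sub-series below a threshold
print("MI scores:", np.round(mi, 3), "kept sub-series:", keep)
```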

Keywords: River stage-discharge process, LSSVM, discrete wavelet transform (DWT), ensemble empirical mode decomposition (EEMD), multi-station modeling.

678 Georgia Case: Tourism Expenses of International Visitors on the Basis of Growing Attractiveness

Authors: Nino Abesadze, Marine Mindorashvili, Nino Paresashvili

Abstract:

At present, actual tourism indicators cannot be calculated in Georgia, making it impossible to perform a quantitative analysis of them; therefore, the study conducted by us is highly important from both a theoretical and a practical standpoint. The main purpose of the article is to carry out a complex statistical analysis of the tourist expenses of foreign visitors and to calculate statistical attractiveness indices of the tourism potential of Georgia. During the research, a method involving random and proportional selection was applied, and the statistical data were processed for the corresponding analysis with the SPSS software. The corresponding tourism statistics methodology was implemented according to international standards: important information was collected and grouped from the major Georgian airports, and a representative population of foreign visitors and a rule for the selection of respondents were determined. The results show a growing trend in tourist numbers, and the share of tourists from post-Soviet countries is constantly increasing. The level of satisfaction with tourist facilities and quality of service has improved, but there is still a problem of disparity between service quality and prices. The composition of the tourist expenses of foreign visitors is diverse, and the competitiveness of the tourist products of Georgian tourist companies is higher. The attractiveness of popular Georgian cities has increased by 43%.

Keywords: Tourist, expenses, indexes, statistics, analysis.

677 Threshold Concepts in TESOL: A Thematic Analysis of Disciplinary Guiding Principles

Authors: Neil Morgan

Abstract:

The notion of Threshold Concepts has offered a fertile new perspective on the transformative effects of mastery of particular concepts on student understanding of subject matter and their developing identities as inductees into disciplinary discourse communities. Only by successfully traversing essential knowledge thresholds can neophytes achieve the more sophisticated understandings of subject matter possessed by mature members of a discipline. This paper uses thematic analysis of disciplinary guiding principles to identify nine candidate Threshold Concepts that appear to underpin effective TESOL practice. The relationship between these candidate TESOL Threshold Concepts, TESOL principles, and TESOL instructional techniques appears to be amenable to a schematic representation based on superordinate categories of TESOL practitioner concern and, as such, offers an alternative to the view of Threshold Concepts as a privileged subset of disciplinary core concepts. The paper concludes by exploring the potential of a Threshold Concepts framework to productively inform TESOL initial teacher education (ITE) and in-service education and training (INSET).

Keywords: TESOL, threshold concepts, TESOL principles, TESOL ITE/INSET, community of practice.

676 Data Mining Determination of Sunlight Average Input for Solar Power Plant

Authors: Fl. Loury, P. Sablonière, C. Lamoureux, G. Magnier, Th. Gutierrez

Abstract:

A method is proposed to extract faithful representative patterns from a set of observations suffering from non-negligible fluctuations. Supposing the time interval between measurements to be extremely small compared to the observation time, the method consists of first defining a subset of intermediate time intervals characterizing coherent behavior. Projecting the data onto these intervals gives a set of curves, out of which an ideally “perfect” one is constructed by taking their pointwise supremum. Comparison with the average real curve in the corresponding interval then gives an efficiency parameter expressing the degradation caused by the fluctuations. The method is applied to sunlight data collected at a specific place, where the ideal sunlight is that resulting from direct exposure at the location's latitude over the year, and the efficiency reflects the action of meteorological parameters, mainly cloudiness, at different periods of the year. The extracted information already provides interesting elements for decision making before being used for the analysis of plant control.
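The construction described above is sketched below on synthetic data: per-interval curves are degraded copies of a daylight profile, the "perfect" curve is their pointwise supremum, and the efficiency factor is the ratio of the real average to that envelope. The synthetic data and the exact ratio used stand in for the authors' actual procedure.

```python
# Sketch: sup-envelope construction and efficiency factor on synthetic sunlight curves.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24)
ideal_shape = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None)   # daylight profile

# Several days (sub-intervals) degraded by random "cloudiness".
days = np.array([ideal_shape * rng.uniform(0.4, 1.0) for _ in range(30)])

envelope = days.max(axis=0)                  # pointwise sup over the interval curves
average  = days.mean(axis=0)                 # average real curve
efficiency = average.sum() / envelope.sum()  # degradation due to fluctuations
print("efficiency factor:", round(efficiency, 3))
```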

Keywords: Base Input Reconstruction, Data Mining, Efficiency Factor, Information Pattern Operator.

675 Automatic Detection of Defects in Ornamental Limestone Using Wavelets

Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas

Abstract:

A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects according to the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that allows the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park with human operators manipulating stone plates as large as 3 m x 2 m and weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The basis of the algorithm is a wavelet decomposition executed on two instances of the original image in order to detect both hypotheses, dark and clear defects. The existence and/or size of these defects is the gauge used to classify the quality grade of the stone products. The tuning of the parameters available within the wavelet framework corresponds to different levels of accuracy in drawing the contours and selecting the defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the allowed dimension of defects.
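A rough sketch of one wavelet pass of this kind is shown below: decompose the plate image, threshold the detail coefficients, and keep a binary defect map. The synthetic image, wavelet choice, and threshold are placeholders and do not reproduce the paper's tuned parameters.

```python
# Rough sketch of wavelet-based defect localization on a synthetic plate image.
import numpy as np
import pywt

rng = np.random.default_rng(0)
plate = rng.normal(128, 5, (256, 256))          # roughly uniform stone texture
plate[100:110, 60:160] -= 60                    # dark crack-like defect

cA, (cH, cV, cD) = pywt.dwt2(plate, "haar")     # single-level 2-D decomposition
detail_energy = np.abs(cH) + np.abs(cV) + np.abs(cD)

threshold = detail_energy.mean() + 3 * detail_energy.std()
defect_map = detail_energy > threshold          # half-resolution binary defect map
print("flagged coefficients:", int(defect_map.sum()))
```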

Keywords: Automatic detection, wavelets, defects, fracture lines.

674 Robot Movement Using the Trust Region Policy Optimization

Authors: Romisaa Ali

Abstract:

The policy gradient approach is a subset of Deep Reinforcement Learning (DRL), which combines Deep Neural Networks (DNNs) with Reinforcement Learning (RL). This approach finds the optimal policy for robot movement based on the experience the robot gains from interaction with its environment. Unlike previous policy gradient algorithms, which were unable to handle the variance and bias errors introduced by the DNN model through over- or underestimation, this algorithm is capable of handling both types of error. This article discusses the state-of-the-art (SOTA) policy gradient technique, Trust Region Policy Optimization (TRPO), by applying the method in various environments and comparing it with another policy gradient method, Proximal Policy Optimization (PPO), to explain their robust optimization, gathering experience data during various training phases and observing the impact of hyper-parameters on neural network performance.

Keywords: Deep neural networks, deep reinforcement learning, Proximal Policy Optimization, state-of-the-art, trust region policy optimization.

673 Implementation of a Multimodal Biometrics Recognition System with Combined Palm Print and Iris Features

Authors: Rabab M. Ramadan, Elaraby A. Elgallad

Abstract:

With extensive application, the performance of unimodal biometric systems has to face a diversity of problems such as signal and background noise, distortion, and environmental differences. Multimodal biometric systems have therefore been proposed to solve these problems. This paper introduces a bimodal biometric recognition system based on features extracted from the human palm print and iris. Palm print biometrics is a fairly new, evolving technology used to identify people by their palm features, while the iris is a strong competitor, together with the face and fingerprints, for inclusion in multimodal recognition systems. In this research, we introduce an algorithm for combining the extracted palm print and iris features using a texture-based descriptor, the Scale Invariant Feature Transform (SIFT). Since the feature sets are non-homogeneous, as features of different biometric modalities are used, these features are concatenated to form a single feature vector. Particle swarm optimization (PSO) is used as a feature selection technique to reduce the dimensionality of the feature vector. The proposed algorithm is applied to the Indian Institute of Technology Delhi (IITD) database and its performance is compared with various iris recognition algorithms found in the literature.

Keywords: Iris recognition, particle swarm optimization, feature extraction, feature selection, palm print, scale invariant feature transform.

672 Weighted Data Replication Strategy for Data Grid Considering Economic Approach

Authors: N. Mansouri, A. Asadi

Abstract:

A Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy called Enhanced Latest Access Largest Weight (ELALW), an enhanced version of the Latest Access Largest Weight strategy, is proposed. Because the storage capacity of each Grid site is limited, replication should be used wisely, and it is therefore important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the expected number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas: the proposed replica selection method chooses the best replica location from among the many replicas based on a response time determined by the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage and storage resource usage.
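A minimal sketch of the replica-selection rule is given below: estimate each site's response time from transfer time, storage latency, queued requests, and distance, then pick the minimum. All site figures and the per-hop penalty are invented, and the cost model is an assumption rather than the paper's exact formula.

```python
# Sketch of best-replica selection by estimated response time.
def response_time(replica, file_size_mb):
    transfer = file_size_mb / replica["bandwidth_mbps"]          # data transfer time
    queue    = replica["queued_requests"] * replica["storage_latency"]
    distance = replica["hops"] * 0.01                            # per-hop penalty (assumed)
    return transfer + replica["storage_latency"] + queue + distance

replicas = {
    "siteA": {"bandwidth_mbps": 100, "storage_latency": 0.05, "queued_requests": 4, "hops": 2},
    "siteB": {"bandwidth_mbps": 40,  "storage_latency": 0.02, "queued_requests": 0, "hops": 1},
    "siteC": {"bandwidth_mbps": 250, "storage_latency": 0.08, "queued_requests": 9, "hops": 5},
}

file_size_mb = 500
best = min(replicas, key=lambda s: response_time(replicas[s], file_size_mb))
print("best replica location:", best)
```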

Keywords: Data grid, data replication, simulation, replica selection, replica placement.
