Search results for: model selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17897

17597 Firm Level Productivity Heterogeneity and Export Behavior: Evidence from UK

Authors: Umut Erksan Senalp

Abstract:

The aim of this study is to examine the link between firm-level productivity heterogeneity and the firm's decision to export. We thus test the self-selection hypothesis, which suggests that only more productive firms self-select into export markets. We analyze the UK manufacturing sector using firm-level data for the period 2003-2011. Although our preliminary results suggest that exporters outperform non-exporters when all manufacturing industries are pooled, when we examine each industry individually, we find that the self-selection hypothesis does not hold in every industry.

Keywords: total factor productivity, firm heterogeneity, international trade, decision to export

Procedia PDF Downloads 335
17596 AI Peer Review Challenge: Standard Model of Physics vs 4D GEM EOS

Authors: David A. Harness

Abstract:

The natural evolution of automated theorem proving (ATP) cognitive systems is to meet AI peer review standards. The ATP process of axiom selection from Mizar to prove a conjecture would be further refined, as in all human and machine learning, by solving the real-world problem of the proposed AI peer review challenge: determine which conjecture forms the higher-confidence-level constructive proof, the Standard Model of Physics SU(n) lattice gauge group operation or the present non-standard 4D GEM EOS SU(n) lattice gauge group spatially extended operation, in which the photon and electron are the first two trace angular momentum invariants of a gravitoelectromagnetic (GEM) energy momentum density tensor wavetrain integration spin-stress pressure-volume equation of state (EOS), initiated via 32 lines of Mathematica code. The resulting gravitoelectromagnetic spectrum ranges from compressive through rarefactive of the central cosmological constant vacuum energy density, in units of pascals. Said self-adjoint group operation operates exclusively on the stress energy momentum tensor of the Einstein field equations, introducing quantization directly at the 4D spacetime level and essentially reformulating the Yang-Mills virtual superpositioned particle compounded lattice gauge group quantization of the vacuum into a single hyper-complex multi-valued GEM U(1) × SU(1,3) lattice gauge group Planck spacetime mesh quantization of the vacuum. Thus the Mizar corpus already contains all of the axioms required for relevant DeepMath premise selection and unambiguous formal natural language parsing in context deep learning.

Keywords: automated theorem proving, constructive quantum field theory, information theory, neural networks

Procedia PDF Downloads 149
17595 High-Throughput Screening and Selection of Electrogenic Microbial Communities Using Single Chamber Microbial Fuel Cells Based on 96-Well Plate Array

Authors: Lukasz Szydlowski, Jiri Ehlich, Igor Goryanin

Abstract:

We demonstrate a single-chamber, 96-well-plate-based microbial fuel cell (MFC) with printed electronic components. The invention is aimed at the robust selection of electrogenic microbial communities under specific conditions, e.g., electrode potential, pH, nutrient concentration, and salt concentration, each of which can be varied across the 96-well array, enabling comparative analysis within a homogeneous reactor. It can be used as a standalone technique or in conjunction with other selective processes, e.g., flow cytometry or microfluidic-based dielectrophoretic trapping. Mobile conductive elements, such as carbon paper, carbon sponge, activated charcoal granules, or metal mesh, can be inserted inside the wells to increase the anode surface area, collect electrogenic microorganisms, and transfer them into new reactors or to other analytical work. The 96-well-plate format also allows the device to be operated by automated pipetting stations.

Keywords: bioengineering, electrochemistry, electromicrobiology, microbial fuel cell

Procedia PDF Downloads 113
17594 Prioritization of Customer Order Selection Factors by Utilizing Conjoint Analysis: A Case Study for a Structural Steel Firm

Authors: Burcu Akyildiz, Cigdem Kadaifci, Y. Ilker Topcu, Burc Ulengin

Abstract:

In today's business environment, companies must make strategic decisions to gain sustainable competitive advantage. Order selection is a crucial issue among these decisions, especially in the steel production industry. When a company has allocated a high proportion of its design and production capacity to ongoing projects, determining which customer order to accept from the potential orders without exceeding the remaining capacity is the critical problem. This study aims to identify and prioritize the evaluation factors for the customer order selection problem. Conjoint analysis is used to examine the importance level of each factor, the factors being the potential profit rate per unit of time, the compatibility of the potential order with available capacity, the likelihood of future orders with higher profit, the customer's credit for future business opportunities, and the negotiability of the production schedule for the order.
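As a concrete illustration of the technique, the sketch below estimates part-worth utilities by ordinary least squares on dummy-coded order attributes and derives each factor's relative importance from the range of its part-worths. The three attributes, their levels, and the ratings are synthetic stand-ins, not the study's data.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical order profiles (full factorial) rated for attractiveness (1-9).
profiles = pd.DataFrame({
    "profit_rate":  ["high", "high", "high", "high", "low", "low", "low", "low"],
    "capacity_fit": ["good", "good", "poor", "poor", "good", "good", "poor", "poor"],
    "future_order": ["yes", "no", "yes", "no", "yes", "no", "yes", "no"],
    "rating":       [9, 8, 6, 5, 6, 4, 3, 2],
})

X = pd.get_dummies(profiles.drop(columns="rating"), drop_first=True).astype(float)
fit = sm.OLS(profiles["rating"], sm.add_constant(X)).fit()

# Part-worth of each level relative to the dropped baseline level.
part_worths = fit.params.drop("const")

# Relative importance of a factor = range of its part-worths / total range.
ranges = part_worths.abs().groupby(lambda col: col.rsplit("_", 1)[0]).max()
print((ranges / ranges.sum()).round(2))
```

With two levels per attribute, each range reduces to the absolute value of a single coefficient; with more levels, the range is taken over all level part-worths of that attribute.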

Keywords: conjoint analysis, order prioritization, profit management, structural steel firm

Procedia PDF Downloads 360
17593 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision making in sports, such as the selection of members for a game and game strategy, based on analysis of accumulated sports data has been widely attempted. In the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze per-play game data, such as ball tracking or player motion, because the situation of a game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task considered difficult for any coach. Because replacing the entire lineup is too complicated, the practical questions for substitution are whether the lineup should be changed and whether a Small Ball lineup should be adopted. We therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, indicating a player's contribution to the game, and these scoring data form a time series. To compare the importance of players in different situations and lineups, we combine an RNN (recurrent neural network) model, which can analyze time-series data, with an NN (neural network) model, which can analyze the situation on the court, to build a score prediction model. This model can identify the current optimal lineup for different situations. We collected accumulated NBA data from the 2019-2020 season and apply the method to actual basketball play data to verify the reliability of the proposed model.
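A minimal PyTorch sketch of such a combined architecture is given below: an LSTM summarizes the time series of per-play scoring contributions while a feed-forward branch encodes the current game situation, and the two representations are fused to predict the expected score for a candidate lineup. All dimensions and feature definitions are illustrative assumptions, not the authors' specification.

```python
import torch
import torch.nn as nn

class LineupScoreModel(nn.Module):
    def __init__(self, play_dim=8, situation_dim=12, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(play_dim, hidden, batch_first=True)   # time-series branch
        self.situation = nn.Sequential(                          # game-situation branch
            nn.Linear(situation_dim, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, 1)                     # fused score predictor

    def forward(self, plays, situation):
        # plays: (batch, seq_len, play_dim); situation: (batch, situation_dim)
        _, (h_n, _) = self.rnn(plays)
        fused = torch.cat([h_n[-1], self.situation(situation)], dim=1)
        return self.head(fused).squeeze(1)

model = LineupScoreModel()
plays = torch.randn(4, 50, 8)        # 4 candidate lineups, 50 recent plays each
situation = torch.randn(4, 12)       # e.g., score margin, time remaining, fouls
predicted = model(plays, situation)  # higher predicted score -> better lineup
```

Comparing the predicted scores of the current lineup and a Small Ball alternative then answers the two substitution questions directly.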

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 100
17592 Managing the Local Manager: A Comparative Study of Core HRM Functions in Multinationals

Authors: Maria Khan

Abstract:

Well-designed core Human Resource Management (HRM) functions such as recruitment, selection, and training and development can, if executed effectively, become a strategic advantage for a company. HRM policies for mid-level managers can depend on the type of top management, possibly because expatriate and local leadership perceive effective HRM policies differently. This comparative case study assesses how local mid-level managers are managed in leading multinational telecom companies in Pakistan. Core HRM functions related to managers were analysed through field research based on semi-structured interviews with the relevant human resource managers. Results suggest that recruitment and selection practices do not differ much and comply with best HRM practice. However, there is a difference in the effective implementation of training and development policies. Changing global management trends and skill development demands dictate that MNCs continuously and effectively develop local talent for local and international success.

Keywords: recruitment, selection, training, development, core HRM, human resource management, subsidiary, international staffing, managers, MNC, expatriate

Procedia PDF Downloads 288
17591 The Use of Psychological Tests in Polish Organizations - Empirical Evidence

Authors: Milena Gojny-Zbierowska

Abstract:

In recent decades, psychological tests have been gaining popularity as a method of evaluating personnel, and they bring consulting companies solid profits that rise by up to 10% each year. The market offers a growing range of tools for the assessment of personality. In organizations, tests are used mainly in the recruitment and selection of staff. This paper attempts an initial diagnosis of the state of psychological test use in Polish companies on the basis of empirical research.

Keywords: psychological tests, personality, content analysis, NEO FFI, big five personality model

Procedia PDF Downloads 316
17590 Enhanced Extra Trees Classifier for Epileptic Seizure Prediction

Authors: Maurice Ntahobari, Levin Kuhlmann, Mario Boley, Zhinoos Razavi Hesabi

Abstract:

For machine-learning-based epileptic seizure prediction, it is important for the model to be implementable in the small implantable or wearable devices used to monitor epilepsy patients; however, current state-of-the-art methods are complex and computationally intensive. We use Shapley Additive Explanations (SHAP) to find relevant intracranial electroencephalogram (iEEG) features and improve the computational efficiency of a state-of-the-art seizure prediction method based on the extra trees classifier while maintaining prediction performance. Results for a small contest dataset and a much larger dataset, with continuous recordings of up to 3 years per patient from 15 patients, yield better-than-chance prediction performance (p < 0.004). Moreover, while the performance of the SHAP-based model is comparable to that of the benchmark, the overall training and prediction time of the model has been reduced by a factor of 1.83. Notably, the zero-crossing value proved to be the best EEG feature for seizure prediction. These results suggest that state-of-the-art seizure prediction performance can be achieved using efficient methods based on optimal feature selection.
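The sketch below shows the general shape of such SHAP-guided feature reduction for an extra-trees classifier; synthetic data stands in for the iEEG features, and the top-20 cutoff is an illustrative choice rather than the authors' setting.

```python
import numpy as np
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for windowed iEEG features (zero crossings, band power, ...).
X, y = make_classification(n_samples=2000, n_features=60, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full = ExtraTreesClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

sv = shap.TreeExplainer(full).shap_values(X_te)
sv = sv[1] if isinstance(sv, list) else sv   # older shap: list of per-class arrays
sv = sv[..., 1] if sv.ndim == 3 else sv      # newer shap: (n, features, classes)

importance = np.abs(sv).mean(axis=0)         # mean |SHAP| per feature
top = np.argsort(importance)[::-1][:20]      # keep the 20 most relevant features

reduced = ExtraTreesClassifier(n_estimators=300, random_state=0)
reduced.fit(X_tr[:, top], y_tr)
print(full.score(X_te, y_te), reduced.score(X_te[:, top], y_te))
```

Refitting on the reduced feature set is what buys the training- and prediction-time savings on constrained hardware.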

Keywords: machine learning, seizure prediction, extra tree classifier, SHAP, epilepsy

Procedia PDF Downloads 79
17589 An Efficient Strategy for Relay Selection in Multi-Hop Communication

Authors: Jung-In Baik, Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song

Abstract:

This paper proposes an efficient relaying algorithm that obtains diversity to improve the reliability of a signal. The algorithm achieves time or space diversity gain by delivering multiple versions of the same signal through two routes. Relays are placed between the source and the destination, and the routes between them are configured adaptively to deal with differing channels and noise. Each route consists of one or more relays, and the source transmits its signal to the destination through the routes. The signals from the relays are combined and detected at the destination. The proposed algorithm provides better bit error rate (BER) performance than conventional algorithms.
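The toy Monte Carlo sketch below illustrates the diversity principle the scheme relies on, not the paper's specific algorithm: combining the same BPSK signal received over two independent Rayleigh-faded routes (maximal-ratio combining) versus detecting a single route.

```python
import numpy as np

rng = np.random.default_rng(0)
n, snr_db = 200_000, 10
noise_std = 10 ** (-snr_db / 20)

bits = rng.integers(0, 2, n)
s = 2.0 * bits - 1.0  # BPSK symbols

def route(signal):
    """One Rayleigh-faded route with complex AWGN."""
    h = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    w = noise_std * (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    return h, h * signal + w

h1, r1 = route(s)
h2, r2 = route(s)

one_route = (np.conj(h1) * r1).real > 0                       # single path
two_routes = (np.conj(h1) * r1 + np.conj(h2) * r2).real > 0   # MRC of both paths

print("BER, one route :", np.mean(one_route != (bits == 1)))
print("BER, two routes:", np.mean(two_routes != (bits == 1)))
```

At 10 dB SNR the two-route combiner shows a markedly lower BER, which is the gain that motivates adaptive multi-hop relaying.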

Keywords: multi-hop, OFDM, relay, relaying selection

Procedia PDF Downloads 417
17588 RAD-Seq Data Reveals Evidence of Local Adaptation between Upstream and Downstream Populations of Australian Glass Shrimp

Authors: Sharmeen Rahman, Daniel Schmidt, Jane Hughes

Abstract:

Paratya australiensis Kemp (Decapoda: Atyidae) is a widely distributed indigenous freshwater shrimp, highly abundant in eastern Australia. The species has been considered a model stream organism for studying genetics, dispersal, biology, behaviour, and evolution in atyids. Paratya has a filter-feeding and scavenging habit that plays a significant role in shaping lotic community structure: it has been shown to reduce periphyton and sediment on the hard substrates of coastal streams and hence acts as a strongly interacting ecosystem macroconsumer. In addition, Paratya is one of the major food sources for stream-dwelling fishes. Paratya australiensis is a cryptic species complex consisting of nine highly divergent mitochondrial DNA lineages, one of which has been observed to favour upstream sites at higher altitudes with cooler water temperatures. This study aims to identify local adaptation in upstream and downstream populations of this lineage in three streams in the Conondale Range, north-east of Brisbane, Queensland, Australia. Two populations (upstream and downstream) from each stream were chosen to test for local adaptation, and a parallel pattern of adaptation is expected across all streams. Six populations, each consisting of 24 individuals, were sequenced using the restriction-site-associated DNA sequencing (RAD-seq) technique. Genetic markers (SNPs) were developed using double-digest RAD sequencing (ddRAD-seq) and used for de novo assembly of the Paratya genome. De novo assembly was performed with the Stacks program and produced 56,344 loci for 47 individuals from one stream. Among these individuals, 39 shared 5,819 loci, and these markers are being used to test for local adaptation between upstream and downstream populations using Fst outlier tests (Arlequin) and Bayesian analysis (BayeScan). The Fst outlier test detected 27 loci likely to be under selection, and the Bayesian analysis likewise detected 27 loci under selection; among these, 3 loci showed significant evidence of selection in the BayeScan program. The upstream and downstream populations, on the other hand, are strongly diverged at neutral loci, with an Fst of 0.37. Similar analyses will be carried out for all six populations to determine whether there is a parallel pattern of adaptation across all streams. Furthermore, a multi-locus among-population covariance analysis will be performed to identify potential markers under selection and to compare single-locus and multi-locus approaches for detecting local adaptation. Adaptive genes identified in this study can be used in future work to design primers and test for adaptation in related crustacean species.
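For readers unfamiliar with Fst outlier scans, the sketch below computes a per-SNP Fst between an upstream and a downstream sample using Hudson's estimator (Bhatia et al. 2013) and flags the extreme upper tail. The genotypes are synthetic 0/1/2 allele counts; the study itself used Arlequin and BayeScan rather than this code.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ind, n_snp = 24, 5000
up = rng.binomial(2, 0.5, size=(n_ind, n_snp))     # upstream genotypes
down = rng.binomial(2, 0.5, size=(n_ind, n_snp))   # downstream genotypes

def hudson_fst(g1, g2):
    p1, p2 = g1.mean(axis=0) / 2, g2.mean(axis=0) / 2   # allele frequencies
    n1, n2 = 2 * g1.shape[0], 2 * g2.shape[0]           # allele sample sizes
    num = ((p1 - p2) ** 2
           - p1 * (1 - p1) / (n1 - 1)
           - p2 * (1 - p2) / (n2 - 1))
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

fst = hudson_fst(up, down)
# Candidate targets of selection: loci in the extreme upper tail of Fst.
outliers = np.where(fst > np.nanquantile(fst, 0.999))[0]
print(len(outliers), "candidate outlier loci")
```

Dedicated tools add coalescent simulations or Bayesian posterior odds on top of this basic contrast to control the false-positive rate.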

Keywords: Paratya australiensis, rainforest streams, selection, single nucleotide polymorphism (SNPs)

Procedia PDF Downloads 226
17587 Evidence of Natural Selection Footprints among Some African Chicken Breeds and Village Ecotypes

Authors: Ahmed Elbeltagy, Francesca Bertolini, Damarius Fleming, Angelica Van Goor, Chris Ashwell, Carl Schmidt, Donald Kugonza, Susan Lamont, Max Rothschild

Abstract:

Natural selection is likely the major factor shaping genomic variation in African indigenous rural chickens, leaving genetic footprints in their genomes. To investigate this selection-footprint hypothesis, a total of 292 birds were randomly sampled from three indigenous ecotypes from East Africa (Uganda, Rwanda) and North Africa (Egypt), two registered Egyptian breeds (Fayoumi and Dandarawi), and the synthetic Kuroiler breed. Samples were genotyped using the Affymetrix 600K Axiom® Array, and 526,652 SNPs were utilized in the downstream analysis after quality control. Intra-population runs of homozygosity (ROH) that were consensus in >50% of the individuals of an ecotype or >75% of a breed were studied. To identify inter-population differentiation due to genetic structure, FST was calculated for North vs. East African populations, as well as for pairwise population combinations, in overlapping windows (500 kb with an overlap of 250 kb). A total of 28,563 ROH were detected and classified into three length categories. Sweeps detected by ROH and FST were identified on several autosomes. Several genes in these regions are likely related to adaptation to local environmental stresses, including high altitude, disease resistance, poor nutrition, and oxidative and heat stress, and were linked to gene ontology (GO) terms related to immune response, oxygen consumption and heme binding, carbohydrate metabolism, oxidation-reduction, and behavior. The results indicate a possible effect of natural selection forces in shaping genomic structure for adaptation to local environmental stresses.
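The sketch below illustrates the ROH side of such a scan in simplified form: find long stretches without heterozygous calls in each individual, then keep the runs shared by more than 50% of the sample. The genotype coding, run-length threshold, and consensus rule are simplified assumptions, not the study's array-specific criteria.

```python
import numpy as np

def roh_segments(genotypes, min_len=100):
    """genotypes: 0/1/2 allele counts along one chromosome; 1 = heterozygous."""
    het = genotypes == 1
    runs, start = [], None
    for i, h in enumerate(het):
        if not h and start is None:
            start = i
        elif h and start is not None:
            if i - start >= min_len:
                runs.append((start, i))
            start = None
    if start is not None and len(het) - start >= min_len:
        runs.append((start, len(het)))
    return runs

rng = np.random.default_rng(2)
pop = rng.choice([0, 1, 2], size=(24, 10_000), p=[0.3, 0.4, 0.3])
pop[:, 4000:4200] = 0   # implant a shared homozygous stretch (a mock sweep)

covered = np.zeros(pop.shape[1])
for individual in pop:
    for a, b in roh_segments(individual):
        covered[a:b] += 1
consensus = covered / len(pop) > 0.5   # ROH shared by >50% of the ecotype
print("consensus ROH sites:", int(consensus.sum()))
```

Windowed FST between populations then complements this intra-population signal, as in the abstract.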

Keywords: African Chicken, runs of homozygosity, FST, selection footprints

Procedia PDF Downloads 291
17586 Cloud Shield: Model to Secure User Data While Using Content Delivery Network Services

Authors: Rachna Jain, Sushila Madan, Bindu Garg

Abstract:

Cloud computing is a key powerhouse in numerous organizations due to the shifting of their data into the cloud environment. In recent years, cloud-based services have been used on a large scale for content storage, distribution, and processing. Various issues in the cloud computing environment need to be addressed, and security and privacy are the topmost concerns. In this paper, a novel security model is proposed to secure data while utilizing CDN services such as image-to-icon conversion. A CDN service is a content delivery service that converts, for example, an image to an icon, a Word document to PDF, or LaTeX to PDF. The presented model converts an image into an icon while keeping the image secret: the image is encrypted and can be decrypted by the data owner only. The paper also discusses how the server performs multiplication and selection on encrypted data without decryption. The data can be an image, word-processing, audio, or video file. Moreover, the proposed model is capable of multiplying images, encrypting them, and sending them to a server application for conversion. The prime objective is to encrypt an image and convert the encrypted image to an image icon by utilizing homomorphic encryption.
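The core server-side idea can be sketched with the additively homomorphic Paillier scheme (the python-paillier package, `phe`). Paillier lets an untrusted server add ciphertexts and multiply a ciphertext by a plaintext scalar; full ciphertext-by-ciphertext image multiplication, as the model describes, would require a somewhat or fully homomorphic scheme such as BFV or CKKS, so this is an illustration of the principle, not the paper's construction.

```python
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

pixels = [12, 200, 34, 90]                         # toy grayscale image data
encrypted = [public_key.encrypt(p) for p in pixels]

# Server side: operate on ciphertexts without ever decrypting them.
brightened = [c + 10 for c in encrypted]           # add a plaintext constant
scaled = [c * 2 for c in encrypted]                # multiply by a plaintext scalar

# Owner side: only the private-key holder recovers the processed image.
print([private_key.decrypt(c) for c in scaled])    # -> [24, 400, 68, 180]
```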

Keywords: cloud computing, user data security, homomorphic encryption, image multiplication, CDN service

Procedia PDF Downloads 314
17585 Implementation and Validation of a Damage-Friction Constitutive Model for Concrete

Authors: L. Madouni, M. Ould Ouali, N. E. Hannachi

Abstract:

Two constitutive models for concrete are available in ABAQUS/Explicit, the Brittle Cracking Model and the Concrete Damaged Plasticity Model, and their suitability and limitations are well known. The aim of the present paper is to implement a damage-friction concrete constitutive model and to evaluate the performance of this model by comparing the predicted response with experimental data. The constitutive formulation of this material model is reviewed. In order to have consistent results, the parameter identification and calibration for the model have been performed. Several numerical simulations are presented in this paper, whose results allow for validating the capability of the proposed model for reproducing the typical nonlinear performances of concrete structures under different monotonic and cyclic load conditions. The results of the evaluation will be used for recommendations concerning the application and further improvements of the investigated model.

Keywords: Abaqus, concrete, constitutive model, numerical simulation

Procedia PDF Downloads 330
17584 Assessing Level of Pregnancy Rate and Milk Yield in Indian Murrah Buffaloes

Authors: V. Jamuna, A. K. Chakravarty, C. S. Patil, Vijay Kumar, M. A. Mir, Rakesh Kumar

Abstract:

Intense selection of buffaloes for milk production at the country's organized herds, without due attention to fertility traits such as pregnancy rate, has led to deterioration in their performance. The aim of this study is to develop an optimum model for predicting pregnancy rate and to assess the level of pregnancy rate with respect to milk production in Murrah buffaloes. Data pertaining to 1,224 lactation records of Murrah buffaloes spread over a period of 21 years were analyzed, and pregnancy rate showed a negative phenotypic association with lactation milk yield (-0.08 ± 0.04). To develop an optimum model for pregnancy rate in Murrah buffaloes, seven simple and multiple regression models were built. Among the seven, model II, with service period as the only independent reproduction variable, was found to be the best prediction model based on four statistical criteria: a high coefficient of determination (R²), a low mean sum of squares due to error (MSSe), the conceptual predictive (Cp) value, and the Bayesian information criterion (BIC). To standardize the level of fertility with respect to milk production, pregnancy rate was classified into seven classes in increments of 10%, in all parities and over the lifetime, with the corresponding average pregnancy rate related to the average lactation milk yield (MY). It was observed that to achieve around 2,000 kg MY, which can be considered optimum for Indian Murrah buffaloes, the pregnancy rate should be between 30% and 50%.
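The model-selection step can be illustrated as below, comparing candidate regressions on the criteria the study reports (R², error mean square, BIC). The data are synthetic, with service period (SP) as the dominant predictor, mirroring the winning model II.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({"sp": rng.normal(120, 30, n),    # service period, days
                   "age": rng.normal(60, 12, n)})   # age at calving, months
df["preg_rate"] = 80 - 0.25 * df["sp"] + rng.normal(0, 5, n)  # synthetic truth

models = {"I   (age only)": "preg_rate ~ age",
          "II  (sp only)":  "preg_rate ~ sp",
          "III (sp + age)": "preg_rate ~ sp + age"}
for name, formula in models.items():
    fit = smf.ols(formula, data=df).fit()
    print(f"{name}: R2={fit.rsquared:.3f}  MSe={fit.mse_resid:.2f}  BIC={fit.bic:.1f}")
```

On such data, model II matches model III in fit but wins on BIC because the extra predictor adds no information, which is exactly the parsimony logic behind the study's criteria.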

Keywords: life time, pregnancy rate, production, service period, standardization

Procedia PDF Downloads 599
17583 3D Reconstruction of Human Body Based on Gender Classification

Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo

Abstract:

SMPL-X is a powerful parametric human body model that includes male, neutral, and female models, with significant gender differences among the three. During 3D human body reconstruction, the correct selection of the standard template is crucial for obtaining accurate results. To address this issue, we developed an efficient gender classification algorithm to automatically select the appropriate template for 3D human body reconstruction. The key to this algorithm is the precise analysis of human body features: using the SMPL-X model, the algorithm detects and identifies gender features of the human body and thereby determines which standard template should be used. The accuracy of this algorithm makes the 3D reconstruction process more accurate and reliable, as it can adjust model parameters based on individual gender differences. SMPL-X and the related gender classification algorithm bring important advances to the field of 3D human body reconstruction. By accurately selecting standard templates, they improve the accuracy of reconstruction and have broad potential in various application fields. These technologies continue to drive the development of the 3D reconstruction field, providing more realistic and accurate human body models.
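A hedged sketch of the template-selection idea follows: a lightweight classifier over body shape features decides which SMPL-X template to fit, with a neutral fallback when the decision is uncertain. The shape features and the confidence rule here are illustrative stand-ins, not the paper's pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic training data: 10 shape features per subject, binary gender label.
X = rng.normal(size=(500, 10))
y = ((X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500)) > 0).astype(int)

clf = LogisticRegression().fit(X, y)

def select_template(features, confidence=0.8):
    """Fall back to the neutral template when the classifier is unsure."""
    p_male = clf.predict_proba(features.reshape(1, -1))[0, 1]
    if p_male > confidence:
        return "SMPLX_MALE"
    if p_male < 1 - confidence:
        return "SMPLX_FEMALE"
    return "SMPLX_NEUTRAL"

print(select_template(rng.normal(size=10)))
```

Routing uncertain cases to the neutral model is one way to keep reconstruction robust when gender features are ambiguous.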

Keywords: gender classification, joint detection, SMPL-X, 3D reconstruction

Procedia PDF Downloads 36
17582 The Importance of Downstream Supply Chain in Supply Chain Risk Management: Multi-Objective Optimization

Authors: Zohreh Khojasteh-Ghamari, Takashi Irohara

Abstract:

One of the efficient ways to manage supply chain risk is to avoid interruptions in the supply chain (SC) before they occur. Although the majority of organizations focus on their first-tier suppliers to avoid SC risk, studies show that first-tier suppliers are the cause in only 60 percent of disruption cases; in the other 40 percent, the cause lies in the downstream SC, that is, the second tier and lower. Due to the increasing complexity and interrelation of modern supply chains, SC elements have become difficult to trace. Moreover, studies show a vital need to better understand the integration of risk and visibility, especially in the context of multiple objectives. In this study, we propose a multi-objective programming model to avoid disruption in the SC; the objective is to evaluate the effect of downstream supply chain visibility (SCV) on managing supply chain risk. The model has two objective functions, minimizing total cost and maximizing downstream SCV, with supplier selection as the decision variable. We assume there are several manufacturers and several candidate suppliers; for each manufacturer, the model proposes the best suppliers with the lowest cost and maximum visibility in the downstream supply chain. We examine the applicability of the model through numerical examples, define several scenarios for the datasets, and observe the tendencies. The results show that a minimum level of visibility in the downstream SC is needed to have a safe SC network.
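A compact weighted-sum scalarization of the two objectives can be written with PuLP as below: minimize total cost minus a weighted downstream-visibility term, choosing exactly one supplier per manufacturer. The data, the weight, and the one-supplier-each constraint are illustrative assumptions, not the paper's instance.

```python
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum, value

manufacturers = ["M1", "M2"]
suppliers = ["S1", "S2", "S3"]
cost = {("M1", "S1"): 10, ("M1", "S2"): 8, ("M1", "S3"): 12,
        ("M2", "S1"): 9, ("M2", "S2"): 11, ("M2", "S3"): 7}
visibility = {("M1", "S1"): 0.9, ("M1", "S2"): 0.4, ("M1", "S3"): 0.8,
              ("M2", "S1"): 0.5, ("M2", "S2"): 0.9, ("M2", "S3"): 0.3}
w = 5.0  # trade-off weight placed on downstream visibility

x = {(m, s): LpVariable(f"x_{m}_{s}", cat=LpBinary)
     for m in manufacturers for s in suppliers}

prob = LpProblem("supplier_selection", LpMinimize)
prob += lpSum(cost[k] * x[k] - w * visibility[k] * x[k] for k in x)
for m in manufacturers:                    # exactly one supplier per manufacturer
    prob += lpSum(x[m, s] for s in suppliers) == 1
prob.solve()

print({m: s for (m, s) in x if value(x[m, s]) == 1})
```

Sweeping the weight w traces out the cost-visibility trade-off, which is how the scenario analysis described in the abstract can be reproduced.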

Keywords: downstream supply chain, optimization, supply chain risk, supply chain visibility

Procedia PDF Downloads 218
17581 Multi-Class Text Classification Using Ensembles of Classifiers

Authors: Syed Basit Ali Shah Bukhari, Yan Qiang, Saad Abdul Rauf, Syed Saqlaina Bukhari

Abstract:

Text classification is the methodology of assigning a given text to the appropriate category from a given set of categories. It is vital to use a proper set of pre-processing, feature selection, and classification techniques to achieve this. In this paper, we used different ensemble techniques, along with variation in the feature selection parameters, to observe the change in overall accuracy and in class-based measures such as the precision of each individual category. After subjecting our data to pre-processing and feature selection, individual classifiers were tested first, and then classifiers were combined into ensembles to increase their accuracy. We also studied the impact of decreasing the number of classification categories on overall accuracy. Text classification is widely used in sentiment analysis on social media sites such as Twitter to gauge people's opinions about a cause, and to analyze customers' reviews of products or services. Opinion mining is a vital task in data mining, and text categorization is the backbone of opinion mining.
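A minimal scikit-learn sketch of this kind of pipeline is shown below: TF-IDF features, chi-squared feature selection with a tunable k, and bagging/boosting ensembles compared against a single base classifier. The corpus, the categories, and k are illustrative stand-ins for the paper's data and settings.

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

cats = ["sci.med", "sci.space", "rec.autos"]
train = fetch_20newsgroups(subset="train", categories=cats)
test = fetch_20newsgroups(subset="test", categories=cats)

for name, clf in [("single tree", DecisionTreeClassifier(random_state=0)),
                  ("bagging    ", BaggingClassifier(n_estimators=50, random_state=0)),
                  ("adaboost   ", AdaBoostClassifier(n_estimators=50, random_state=0))]:
    pipe = make_pipeline(TfidfVectorizer(stop_words="english"),
                         SelectKBest(chi2, k=2000),   # vary k to study its effect
                         clf)
    pipe.fit(train.data, train.target)
    print(name, round(pipe.score(test.data, test.target), 3))
```

Varying k in SelectKBest and the number of categories reproduces the two sensitivity studies described in the abstract.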

Keywords: natural language processing, ensemble classifier, bagging classifier, AdaBoost

Procedia PDF Downloads 204
17580 Integrated Approach of Quality Function Deployment, Sensitivity Analysis and Multi-Objective Linear Programming for Business and Supply Chain Programs Selection

Authors: T. T. Tham

Abstract:

The aim of this study is to propose an integrated approach to determine the most suitable programs, based on Quality Function Deployment (QFD), Sensitivity Analysis (SA), and a Multi-Objective Linear Programming model (MOLP). Firstly, QFD is used to determine business requirements and transform them into business and supply chain programs; from the QFD, technical scores for all programs are obtained. All programs are then evaluated against five criteria (productivity, quality, cost, technical score, and feasibility), with sets of criterion weights built using sensitivity analysis. The MOLP model is applied to select suitable programs according to multiple conflicting objectives under a budget constraint. A case study from the Sai Gon-Mien Tay Beer Company illustrates the proposed methodology. The outcome of the study provides a comprehensive picture for companies selecting suitable programs to obtain the optimal solution according to their preferences.
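The QFD step that produces the technical scores can be illustrated numerically: the requirement-program relationship matrix is weighted by the importance of each business requirement. The requirements, programs, and 1/3/9 relationship strengths below are illustrative, not the case-study data.

```python
import numpy as np

requirements = ["on-time delivery", "low defect rate", "flexibility"]
req_weight = np.array([0.5, 0.3, 0.2])            # importance of each requirement

programs = ["supplier audit", "TPM program", "cross-training"]
relationship = np.array([[9, 3, 1],               # rows: requirements
                         [3, 9, 1],               # cols: programs (1/3/9 scale)
                         [1, 3, 9]])

technical_score = req_weight @ relationship
for program, score in zip(programs, technical_score):
    print(f"{program:15s} technical score = {score:.1f}")
```

These scores then enter the five-criteria evaluation, and the MOLP stage selects programs under the budget constraint.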

Keywords: business program, multi-objective linear programming model, quality function deployment, sensitivity analysis, supply chain management

Procedia PDF Downloads 88
17579 Do Career Expectancy Beliefs Foster Stability as Well as Mobility in One's Career? A Conceptual Model

Authors: Bishakha Majumdar, Ranjeet Nambudiri

Abstract:

Considerable dichotomy exists in research regarding the role of optimism and self-efficacy in work and career outcomes. Optimism and self-efficacy are related to performance, commitment, and engagement, but they are also implicated in seeing opportunities outside the firm and switching jobs. Research capturing these opposing strands of findings in the same model, and providing a holistic understanding of how expectancy beliefs operate for the working professional, is absent. We attempt to bridge this gap by proposing that career decision self-efficacy and career outcome expectations affect intention to quit through the competing mediation pathways of internal and external marketability. This model provides a holistic picture of the role of career expectancy beliefs in career outcomes by considering perceived career opportunities both inside and outside one's present organization. The understanding extends the application of career expectancy beliefs to career decision-making by employed individuals. Further, it is valuable for reconsidering the effectiveness of hiring and retention techniques used by a firm, as selection, rewards, and training programs need to be supplemented by interventions that specifically strengthen the stability pathway.

Keywords: career decision self-efficacy, career outcome expectations, marketability, intention to quit, job mobility

Procedia PDF Downloads 601
17578 Metaheuristic to Align Multiple Sequences

Authors: Lamiche Chaabane

Abstract:

In this study, a new method for solving the sequence alignment problem is proposed, named ITS (Improved Tabu Search). The algorithm is based on classical Tabu Search (TS) and is implemented to produce multiple sequence alignments. Several ideas concerning neighbourhood generation, move selection mechanisms, and intensification/diversification strategies for the proposed ITS are investigated. ITS generates high-quality results in terms of alignment scores in comparison with classical TS and a simple iterative search algorithm.
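A compact skeleton of the tabu search loop such a method builds on is sketched below: neighbours are generated by moving a gap within one sequence, the best non-tabu move is chosen (with aspiration for moves that beat the best solution found), and a short-term tabu list forbids immediate reversals. The sum-of-pairs match count is a stand-in for the paper's scoring scheme.

```python
def sum_of_pairs(alignment):
    """Count matching non-gap pairs per column (a toy alignment score)."""
    return sum(sum(a == b != "-" for i, a in enumerate(col) for b in col[i + 1:])
               for col in zip(*alignment))

def neighbours(alignment):
    """Each move swaps a gap with an adjacent residue in one sequence."""
    for i, seq in enumerate(alignment):
        for j, ch in enumerate(seq):
            if ch != "-":
                continue
            for k in (j - 1, j + 1):
                if 0 <= k < len(seq) and seq[k] != "-":
                    s = list(seq)
                    s[j], s[k] = s[k], s[j]
                    yield (i, j, k), alignment[:i] + ["".join(s)] + alignment[i + 1:]

def tabu_search(alignment, iters=200, tenure=10):
    best = current = alignment
    tabu = []
    for _ in range(iters):
        candidates = [(move, nb) for move, nb in neighbours(current)
                      if move not in tabu or sum_of_pairs(nb) > sum_of_pairs(best)]
        if not candidates:
            break
        (i, j, k), current = max(candidates, key=lambda c: sum_of_pairs(c[1]))
        tabu = (tabu + [(i, k, j)])[-tenure:]   # forbid undoing the move
        if sum_of_pairs(current) > sum_of_pairs(best):
            best = current
    return best

result = tabu_search(["AC--GT", "A-CG-T", "ACG--T"])
print(result, sum_of_pairs(result))
```

The intensification and diversification strategies investigated in the paper would be layered on this loop, e.g., restarting from elite solutions or penalizing frequently used moves.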

Keywords: multiple sequence alignment, tabu search, improved tabu search, neighbourhood generation, selection mechanisms

Procedia PDF Downloads 268
17577 Criterion-Referenced Test Reliability through Threshold Loss Agreement: Fuzzy Logic Analysis Approach

Authors: Mohammad Ali Alavidoost, Hossein Bozorgian

Abstract:

Criterion-referenced tests (CRTs) are designed to measure student performance against a fixed set of predetermined criteria or learning standards, and their reliability cannot be based on internal consistency. Threshold loss agreement is one way to calculate the reliability of CRTs; however, the classification of masters and non-masters in such agreement is determined by the threshold point. The problem is that if the threshold point undergoes a minute change, the classification of masters and non-masters may change drastically, altering the reliability results. Therefore, in this study, a fuzzy logic approach is employed as a remedial data-analysis procedure to obviate the threshold point problem. Forty-one Iranian students, all between 20 and 30 years old, were selected. A quantitative approach was used to address the research questions, with a quasi-experimental design, since the selection of participants was not randomized. Under the fuzzy logic approach, the threshold point is more stable during the analysis, resulting in more constant reliability results and more precise assessment.
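A small numeric sketch of the fuzzy reclassification idea is given below: instead of a crisp master/non-master cut at the threshold score, each examinee receives a graded membership, so small shifts of the threshold no longer flip whole classifications. The membership shape, threshold, and scores are assumptions for illustration.

```python
import numpy as np

def master_membership(scores, threshold=60.0, softness=5.0):
    """Logistic membership in the 'master' set; exactly 0.5 at the threshold."""
    return 1.0 / (1.0 + np.exp(-(scores - threshold) / softness))

scores = np.array([52, 58, 59, 61, 62, 75])
for t in (59.0, 60.0, 61.0):                 # nudge the threshold point
    mu = master_membership(scores, threshold=t)
    print(t, mu.round(2))                    # memberships shift smoothly, not abruptly
```

Agreement coefficients computed from such graded memberships inherit this stability, which is the remedial effect the study attributes to the fuzzy approach.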

Keywords: criterion-referenced tests, threshold loss agreement, threshold point, fuzzy logic approach

Procedia PDF Downloads 330
17576 A New Method Separating Relevant Features from Irrelevant Ones Using Fuzzy and OWA Operator Techniques

Authors: Imed Feki, Faouzi Msahli

Abstract:

Selection of relevant parameters from a high-dimensional process operation setting space is a problem frequently encountered in industrial process modelling. This paper presents a method for selecting the most relevant fabric physical parameters for each sensory quality feature. The proposed relevancy criterion has been developed using two approaches. The first utilizes a fuzzy sensitivity criterion, exploiting from experimental data the relationship between physical parameters and all the sensory quality features for each evaluator; an OWA aggregation procedure is then applied to aggregate the ranking lists provided by the different evaluators. In the second approach, another panel of experts provides ranking lists of physical features according to their professional knowledge. By applying OWA and a fuzzy aggregation model, the data-sensitivity-based ranking list and the knowledge-based ranking list are combined using our proposed percolation technique to determine the final ranking list. The key idea of the percolation technique is to filter the relevant features automatically and objectively by creating a gap between the scores of relevant and irrelevant parameters; it automatically generates a threshold, which effectively reduces the human subjectivity and arbitrariness of manually chosen thresholds. For a specific sensory descriptor, the threshold is defined systematically by iteratively aggregating (n times) the ranking lists generated by the OWA and fuzzy models, according to a specific algorithm. Having applied the percolation technique to a real example, stonewashed denim, a well-known finished textile product usually considered among the most important quality criteria in jeans evaluation, we separate the relevant physical features from the irrelevant ones for each sensory descriptor. The originality and performance of the proposed relevant-feature selection method are shown by the variability in the number of physical features in the selected sets: instead of selecting an identical number of features with a predefined threshold, the method adapts to the specific nature of the complex relations between sensory descriptors and physical features, proposing relevant-feature lists of different sizes for different descriptors. To obtain more reliable results, the percolation technique has also been applied to combine the fuzzy global relevancy and OWA global relevancy criteria, in order to clearly distinguish the scores of the relevant physical features from those of the irrelevant ones.
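The OWA aggregation and the percolation-style cut can be sketched numerically as below. Each evaluator scores every physical parameter; ordered weights aggregate the per-evaluator scores into one global relevancy per parameter, and the ranking is then split at the largest gap between consecutive sorted scores. All scores and weights are illustrative.

```python
import numpy as np

def owa(scores, weights):
    """Ordered weighted average: weights apply to descending-sorted scores."""
    return np.sort(scores)[::-1] @ weights

# Relevancy scores of 5 fabric parameters (columns) from 3 evaluators (rows).
scores = np.array([[0.9, 0.2, 0.7, 0.4, 0.1],
                   [0.8, 0.3, 0.6, 0.5, 0.2],
                   [0.7, 0.1, 0.9, 0.3, 0.2]])
w = np.array([0.5, 0.3, 0.2])   # ordered weights favouring the highest scores

global_relevancy = np.array([owa(scores[:, p], w) for p in range(scores.shape[1])])
order = np.argsort(global_relevancy)[::-1]

# Percolation-style cut: split the ranking at the largest score gap.
sorted_scores = global_relevancy[order]
cut = np.argmax(-np.diff(sorted_scores))
print("relevant parameters:", order[:cut + 1])
```

The gap-based cut is what lets the number of selected features differ from one sensory descriptor to another instead of being fixed in advance.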

Keywords: data sensitivity, feature selection, fuzzy logic, OWA operators, percolation technique

Procedia PDF Downloads 569
17575 The Discussion on the Composition of Feng Shui by the Environmental Planning Viewpoint

Authors: Jhuang Jin-Jhong, Hsieh Wei-Fan

Abstract:

Climate change causes natural disasters persistently. Environmental planning therefore now tends toward respecting nature and coexisting with it, and as a result, natural environment analysis, e.g., analysis of topography, soil, hydrology, climate, and vegetation, is highly emphasized. On the other hand, Feng Shui has been a criterion for residential site selection in the East since ancient times and has had further influence on site selection for castles and even temples and tombs. Its primary criterion is judging the quality of Long (the mountain range), Sha (nearby mountains), Shui (hydrology), Xue (the foundation), and Xiang (aspect), which are similar to the environmental variables of mountain range, topography, hydrology, and aspect. For this reason, many researchers have attempted to probe the connection between the criteria of Feng Shui and environmental planning factors. Most research, however, has discussed only the composition and spatial theory of Feng Shui; no research has explained Feng Shui through the environmental field. Consequently, this study reviewed the theory of Feng Shui from an environmental planning viewpoint and assembled the essential composition factors of Feng Shui. From the literature review and comparison of theoretical meanings, we find that the ideal principles for planning the Feng Shui environment can also be used for environmental planning. This article therefore uses 12 ideal environmental features from Feng Shui to contrast the natural aspects of the environment, makes comparisons with previous research, and classifies the environmental factors into climate, topography, hydrology, vegetation, and soil.

Keywords: the composition of Feng Shui, environmental planning, site selection, main components of the Feng Shui environment

Procedia PDF Downloads 483
17574 Investment Decision among Public Sector Retirees: A Behavioural Finance View

Authors: Bisi S. Olawoyin

Abstract:

This study attempts an exploration into behavioural finance, in which the traditional assumptions of expected utility maximization with rational investors in efficient markets are dropped. It reviews prior research and evidence about how psychological biases affect investor behaviour and stock selection. The study examined the relationship between demographic variables and behavioural finance biases among public sector retirees who invested in the Nigerian Stock Exchange prior to their retirement. Using a questionnaire survey, a total of 214 valid convenience samples were collected in order to determine how specific demographic and psychological traits affect stock selection between dividend-paying and non-dividend-paying stocks. Descriptive statistics and OLS regression were used to analyse the results. Findings showed that most of the retirees preferred dividend-paying stocks in the few years preceding their retirement but still held on to their non-dividend-paying stocks at retirement. A significant difference also exists between senior and junior retirees in their preference for non-dividend-paying stocks. These findings are consistent with the clientele theories of dividends.

Keywords: behavioural finance, clientele theories, dividend paying stocks, stock selection

Procedia PDF Downloads 111
17573 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory

Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi

Abstract:

One of the global combinatorial optimization problems in machine learning is feature selection, which is concerned with removing irrelevant, noisy, and redundant data while keeping the original meaning of the data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs), which combine a genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy-controlled great deluge algorithm, respectively, to identify a good balance between local search and genetic search. To verify the proposed approaches, numerical experiments are carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches.

Keywords: rough set theory, attribute reduction, fuzzy logic, memetic algorithms, record to record algorithm, great deluge algorithm

Procedia PDF Downloads 421
17572 Model Driven Architecture Methodologies: A Review

Authors: Arslan Murtaza

Abstract:

Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development in which different models are proposed and then converted into code. The main idea is to specify a task using a Platform Independent Model (PIM), transform it into a Platform Specific Model (PSM), and then convert that into code. This review paper describes some challenges and issues faced in MDA, the types and transformations of models (e.g., CIM, PIM, and PSM), and an evaluation of MDA-based methodologies.

Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies

Procedia PDF Downloads 414
17571 Explainable Graph Attention Networks

Authors: David Pham, Yongfeng Zhang

Abstract:

Graphs are an important structure for data storage and computation. Recent years have seen the success of deep learning on graphs such as Graph Neural Networks (GNN) on various data mining and machine learning tasks. However, most of the deep learning models on graphs cannot easily explain their predictions and are thus often labelled as “black boxes.” For example, Graph Attention Network (GAT) is a frequently used GNN architecture, which adopts an attention mechanism to carefully select the neighborhood nodes for message passing and aggregation. However, it is difficult to explain why certain neighbors are selected while others are not and how the selected neighbors contribute to the final classification result. In this paper, we present a graph learning model called Explainable Graph Attention Network (XGAT), which integrates graph attention modeling and explainability. We use a single model to target both the accuracy and explainability of problem spaces and show that in the context of graph attention modeling, we can design a unified neighborhood selection strategy that selects appropriate neighbor nodes for both better accuracy and enhanced explainability. To justify this, we conduct extensive experiments to better understand the behavior of our model under different conditions and show an increase in both accuracy and explainability.
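For readers who want to see where such attention explanations come from, the sketch below extracts the attention coefficients of a standard GAT layer using PyTorch Geometric's GATConv; XGAT's unified neighborhood-selection strategy itself is not reproduced here, and the toy graph is an assumption.

```python
import torch
from torch_geometric.nn import GATConv

x = torch.randn(4, 16)                       # 4 nodes with 16 features each
edge_index = torch.tensor([[0, 1, 2, 3, 1],  # directed edges: source row,
                           [1, 0, 1, 1, 2]]) # destination row

conv = GATConv(in_channels=16, out_channels=8, heads=2)
out, (edges, alpha) = conv(x, edge_index, return_attention_weights=True)

# alpha[i, h] is the weight head h assigns to edge edges[:, i] (GATConv adds
# self-loops); large values flag the neighbors that dominated the
# aggregation for that destination node.
for i in range(edges.size(1)):
    src, dst = edges[0, i].item(), edges[1, i].item()
    print(f"edge {src} -> {dst}: mean attention {alpha[i].mean().item():.3f}")
```

Inspecting, and in XGAT's case shaping, these coefficients is what turns the attention mechanism into an explanation of the classification.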

Keywords: explainable AI, graph attention network, graph neural network, node classification

Procedia PDF Downloads 137
17570 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the ten or fewer most predictive questions that would accurately assign new individuals to custom segments; furthermore, the solution needed to classify quickly and be usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest, an ensemble of individual decision trees, produced predicted segments with robust precision and recall scores compared to a single tree. A random 70-30 stratified split was used for training, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real time. Random Forest was determined to be an optimal classification model for its feature selection, performance, processing speed, and flexible application in other environments.
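The reported workflow (rank features with the forest, keep the top 20, refit at depth 10) can be sketched as follows, with synthetic data replacing the 254-feature survey dataset.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 7,000 respondents, 254 features, 5 custom segments.
X, y = make_classification(n_samples=7000, n_features=254, n_informative=25,
                           n_classes=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, test_size=0.3,
                                          random_state=0)

full = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
top20 = np.argsort(full.feature_importances_)[::-1][:20]   # most predictive features

small = RandomForestClassifier(max_depth=10, random_state=0)
small.fit(X_tr[:, top20], y_tr)
print(full.score(X_te, y_te), small.score(X_te[:, top20], y_te))
```

A depth-10 forest over 20 features is also small enough to export as explicit decision rules, which is what makes a worksheet implementation practical.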

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 68
17569 Importance of Location Selection of an Energy Storage System in a Smart Grid

Authors: Vanaja Rao

Abstract:

In recent times, the need to integrate Renewable Energy Sources (RES) into a Smart Grid has been on the rise. As a result, the associated energy storage systems are known to play an important role in sustaining the efficient operation of RES such as wind and solar power. This paper investigates the importance of location selection for Energy Storage Systems (ESSs) in a Smart Grid. Three scenarios of ESS location are studied and analyzed: 1. near the generation/source, 2. in the middle of the grid, and 3. near the demand/consumption. The aim is to assist any Distribution Network Operator (DNO) in deploying ESSs in a power network, which will significantly help reduce the costs and time of planning and avoid the damages incurred by installing them at an incorrect location in a Smart Grid. To do this, the outlined scenarios are modelled and analyzed with the National Grid's datasets of energy generation and consumption in the UK power network. The outcome of this analysis provides a better overview for ESS location selection in a Smart Grid, ensuring power system stability and security along with optimal usage of the ESSs.

Keywords: distribution networks, energy storage system, energy security, location planning, power stability, smart grid

Procedia PDF Downloads 273
17568 Automation of Embodied Energy Calculations for Buildings through Building Information Modelling

Authors: Ahmad Odeh

Abstract:

Researchers are currently more concerned with energy calculations at the operational stage, mainly due to its larger environmental impact, but the fact remains that embodied energy represents a substantial contributor unaccounted for in the overall energy computation. Calculating materials' embodied energy during the construction stage is complicated, due to the various factors involved: the equipment used, the fuel needed, and the electricity required for each type of material vary with location, so the embodied energy differs for each project. Moreover, the methods used in manufacturing, transporting, and putting materials in place have a significant influence on their embodied energy. This variability has made it difficult to calculate, or even benchmark, the usage of such energies. This paper presents a model aimed at calculating embodied energies based on these variabilities, with a systematic approach that uses an efficient method of calculation to provide new insight for the selection of construction materials. The model is developed in a BIM environment, and the quantification of materials' energy is determined over the three main stages of their lifecycle: manufacturing, transporting, and placing. The model uses three major databases, each of which contains a set of the construction materials most commonly used in building projects: the first holds information about the energy required to manufacture each type of material, the second the energy required to transport it, and the third the energy required by machinery to place it in its intended location. Through geospatial data analysis, the model automatically calculates the distances between suppliers and construction sites and then uses the dataset information for energy computations. The sum of all the energies is computed automatically, and the model provides designers with a list of usable equipment along with the associated embodied energies.
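The three-stage summation at the heart of the model can be sketched as a lookup-and-sum over a bill of quantities. The material names, energy coefficients, and transport factor below are illustrative placeholders, not the model's databases.

```python
from dataclasses import dataclass

MANUFACTURE = {"concrete": 1.1, "steel": 20.1}   # MJ per kg (illustrative)
TRANSPORT = {"truck": 0.0016}                    # MJ per kg per km (illustrative)
PLACEMENT = {"concrete": 0.08, "steel": 0.25}    # MJ per kg, crane/pump use

@dataclass
class MaterialTakeoff:
    material: str
    mass_kg: float
    distance_km: float      # supplier-to-site distance from geospatial analysis
    mode: str = "truck"

    def embodied_energy(self) -> float:
        """Manufacturing + transport + placement energy for this line item."""
        return self.mass_kg * (MANUFACTURE[self.material]
                               + TRANSPORT[self.mode] * self.distance_km
                               + PLACEMENT[self.material])

takeoff = [MaterialTakeoff("concrete", 250_000, 40),
           MaterialTakeoff("steel", 30_000, 120)]
total_mj = sum(item.embodied_energy() for item in takeoff)
print(f"total embodied energy: {total_mj / 1000:.0f} GJ")
```

In the BIM model itself, the quantities and distances would be pulled from the building model and geospatial data analysis rather than entered by hand.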

Keywords: BIM, lifecycle energy assessment, building automation, energy conservation

Procedia PDF Downloads 171