Search results for: representation selection
3261 Estimation of Fragility Curves Using Proposed Ground Motion Selection and Scaling Procedure
Authors: Esra Zengin, Sinan Akkar
Abstract:
Reliable and accurate prediction of nonlinear structural response requires the specification of appropriate earthquake ground motions for nonlinear time history analysis. Current research has mainly focused on the selection and manipulation of real earthquake records, which can be seen as the most critical step in performance-based seismic design and assessment of structures. Utilizing amplitude-scaled ground motions that match a target spectrum is a commonly used technique for estimating nonlinear structural response. Representative ground motion ensembles are selected to match a target spectrum such as a scenario-based spectrum derived from ground motion prediction equations, the Uniform Hazard Spectrum (UHS), the Conditional Mean Spectrum (CMS), or the Conditional Spectrum (CS). Different sets of criteria exist among these methodologies for selecting and scaling ground motions with the objective of obtaining robust estimates of structural performance. This study presents a ground motion selection and scaling procedure that considers the spectral variability at the target demand level together with the dispersion of the ground motions. The proposed methodology provides a set of ground motions whose response spectra match the target median and corresponding variance within a specified period interval. An efficient and simple algorithm is used to assemble the ground motion sets. The scaling stage is based on minimizing the error between the scaled median and the target spectrum while the dispersion of the earthquake shaking is preserved along the period interval. The impact of spectral variability on the nonlinear response distribution is investigated at the level of inelastic single-degree-of-freedom systems. To see the effect of different selection and scaling methodologies on fragility curve estimates, results are compared with those obtained by a CMS-based scaling methodology.
The variability in fragility curves due to the consideration of dispersion in the ground motion selection process is also examined.
Keywords: ground motion selection, scaling, uncertainty, fragility curve
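As an illustration of the scaling stage described above: applying a single scale factor to every record shifts the ensemble's log-median spectrum by a constant while leaving the log-dispersion unchanged, so the error-minimizing factor has a closed form. The sketch below is our own illustration, not the authors' code; the function name and the log-space least-squares formulation are assumptions.

```python
import math

def optimal_scale(median_spectrum, target_spectrum):
    """Least-squares scale factor in log space.  Scaling every record in
    the ensemble by a single factor s shifts the log-median spectrum by
    log(s) while leaving the record-to-record log-dispersion unchanged,
    which is the property the scaling stage relies on."""
    log_ratios = [math.log(t / m)
                  for m, t in zip(median_spectrum, target_spectrum)]
    return math.exp(sum(log_ratios) / len(log_ratios))
```

For example, a median spectrum that is uniformly half the target is simply scaled by 2.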
Procedia PDF Downloads 583
3260 Awning: An Unsung Trait in Rice (Oryza sativa L.)
Authors: Chamin Chimyang
Abstract:
The fast-changing global trend and declining forest cover have impacted agricultural lands; animals, especially birds, may become major pests in the near future, a problem that goes neglected or unreported in much of the literature, mainly because bird infestation is a pocket-zone problem. This bird infestation can be attributed to the balding of forest regions and the decline of foraging hotspots due to anthropogenic activity. There are many ways, both conventional and non-conventional, to keep birds away from agricultural fields. But the question here is whether traditional bird-scaring methods such as scarecrows are effective enough. There are many traits in rice that are supposed to keep birds from foraging in paddy fields, and selection for such traits might be rewarding, such as the angle of the flag leaf from the stem, grain size, the novelty of a trait in a particular region, and also awning. Awning, as such, is a very particular trait on which negative selection was imposed to such an extent that there has been a decline in the nucleotide diversity responsible for the said trait. Thus, this session will discuss in detail the histology, genetics, and genes behind the trait, and how awns might be one of the solutions to the problem stated above.
Keywords: bird infestation, awning, negative selection, domestication
Procedia PDF Downloads 25
3259 Comparative Analysis of Decentralized Financial Education Systems: Lessons From Global Implementations
Authors: Flex Anim
Abstract:
The financial studies system is a decentralized education system that was put into place in Ghana as a grassroots approach to financial studies. Its main goal is to give people the precise knowledge, abilities, and training required for a given trade, business, profession, or occupation. This essay asks how the devolution of the financial studies system to local businesses results in responsible and responsive representation as well as long-term company learning. It centers on two case studies, Asekwa Municipal and Oforikrom. The study then asks how senior high school students are rebuilding their livelihoods and socioeconomic well-being by creating new curricula and social practices related to the finance and business studies system. Here the paper concentrates on Kumasi District and draws inferences for the other two cases. The paper demonstrates how the establishment of representative groups under the financial studies system creates the democratic space required for the successful representation of community goals. Nonetheless, the interests of a privileged few are advanced as a result of elite capture. The state's financial and business training programs do not adhere to the financial studies system's established policy procedures and do not transfer pertinent and discretionary resources to local educators. As a result, local educators are unable to encourage representation that is accountable and responsive. The financial studies system continues to pique the interest of rural areas, but this interest is skewed toward gaining access to financial or business training institutions for higher education. Since the locals are not actively involved in financial education, the financial studies system serves only to advance the interests of specific populations.
This article explains how rhetoric and personal benefits can sustain public support even in the case of "failed" interventions.
Keywords: financial studies system, financial studies devolution, local government, senior high schools and financial education, community goals and representation
Procedia PDF Downloads 73
3258 A New Framework for ECG Signal Modeling and Compression Based on Compressed Sensing Theory
Authors: Siavash Eftekharifar, Tohid Yousefi Rezaii, Mahdi Shamsi
Abstract:
The purpose of this paper is to exploit the compressed sensing (CS) method to model and compress electrocardiogram (ECG) signals at a high compression ratio. To obtain a sparse representation of the ECG signals, a suitable basis matrix with Gaussian kernels, which are shown to fit the ECG signals nicely, is first constructed. Then the sparse model is extracted by applying an optimization technique. Finally, CS theory is utilized to obtain a compressed version of the sparse signal. Reconstruction of the ECG signal from the compressed version is also performed to prove the reliability of the algorithm. At this stage, a greedy optimization technique is used to reconstruct the ECG signal, and the Mean Square Error (MSE) is calculated to evaluate the precision of the proposed compression method.
Keywords: compressed sensing, ECG compression, Gaussian kernel, sparse representation
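The modeling pipeline described here (a Gaussian-kernel dictionary plus greedy sparse coding) can be sketched in a few lines. This is a hypothetical illustration of the general approach, not the authors' implementation: the atom widths and the plain matching-pursuit variant of greedy optimization are our assumptions.

```python
import math

def gaussian_atom(n, center, width):
    """One Gaussian kernel atom of length n, normalised to unit energy."""
    atom = [math.exp(-((i - center) ** 2) / (2.0 * width ** 2)) for i in range(n)]
    norm = math.sqrt(sum(a * a for a in atom))
    return [a / norm for a in atom]

def build_dictionary(n, widths=(2.0, 4.0, 8.0)):
    """Overcomplete dictionary: one atom per (centre position, width)."""
    return [gaussian_atom(n, c, w) for w in widths for c in range(n)]

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy sparse coding: repeatedly pick the atom most correlated
    with the residual and subtract its contribution."""
    residual = list(signal)
    coeffs = {}
    for _ in range(n_atoms):
        best_k, best_c = max(
            ((k, sum(r * a for r, a in zip(residual, atom)))
             for k, atom in enumerate(dictionary)),
            key=lambda kc: abs(kc[1]))
        coeffs[best_k] = coeffs.get(best_k, 0.0) + best_c
        residual = [r - best_c * a for r, a in zip(residual, dictionary[best_k])]
    return coeffs, residual

def reconstruct(coeffs, dictionary, n):
    """Rebuild the signal from the sparse coefficient dictionary."""
    out = [0.0] * n
    for k, c in coeffs.items():
        for i, a in enumerate(dictionary[k]):
            out[i] += c * a
    return out
```

A signal that is exactly one scaled atom is recovered in a single greedy step, which is the sense in which the sparse model "fits" signals built from such kernels.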
Procedia PDF Downloads 462
3257 EFL Vocabulary Learning Strategies among Students in Greece, Their Preferences and Internet Technology
Authors: Theodorou Kyriaki, Ypsilantis George
Abstract:
Vocabulary learning has attracted a lot of attention in recent years, in contrast to its neglect in the past. Along with the interest in finding successful vocabulary teaching strategies, many scholars have focused on identifying the learning strategies used by language learners. As a result, more and more studies in language pedagogy have investigated the use of strategies in vocabulary learning by different types of learners. A common instrument in this field is the questionnaire, a tool that was enriched here with questions involving current technology and administered to a sample of 300 Greek students aged 9 to 17 years. The strategies identified were grouped into three categories (memory, cognitive, and compensatory), and associations between these dependent variables were investigated. In addition, relations between dependent and independent variables (such as age, sex, type of school, cultural background, and grade in English) were pursued to investigate their impact on strategy selection. Finally, results were compared to findings of other studies in the same field to contribute to a hypothesis of ethnic differences in strategy selection. Results initially discuss the preferred strategies of all participants and further indicate that: a) technology affects strategy selection, while b) differences between ethnic groups are not statistically significant. A number of successful strategies are presented, resulting from correlations between strategy selection and final school grade in English.
Keywords: acquisition of English, internet technology, research among Greek students, vocabulary learning strategies
Procedia PDF Downloads 510
3256 Evaluating the Role of Cinema in the Formation of Cultural Schemas of Iranian Families by Studying the Opinions of Critics at the Venice Film Festival
Authors: Elahe Zavareian
Abstract:
Cinema is a powerful medium that can depict and critique sociological and cultural issues, contributing to the discussion of important societal issues and raising awareness. Family crises and challenges are significant concerns faced by societies worldwide. The family serves as the central core of societal formation, and the challenges experienced within this small social group have implications not only for individuals within a country but also for the wider culture. The depiction of the family represents the entire society to other countries, shaping ideas and prejudices regarding interpersonal culture and relationships. The representation of society's problems through cinema influences the formation of cultural schemas both within the country producing the films and among the societies that view them.
Keywords: interpersonal culture, representation, society, family, cultural schemas
Procedia PDF Downloads 68
3255 A Two-Tailed Secretary Problem with Multiple Criteria
Authors: Alaka Padhye, S. P. Kane
Abstract:
The following study considers some variations of the secretary problem (SP). In a multiple criteria secretary problem (MCSP), the selection of a unit is based on two independent characteristics. The number of units that appear before an observer is known, say N, the best rank of a unit being N. A unit is selected if it is better with respect to the first or the second or both characteristics. When the number of units is large, then due to constraints like time and cost, the observer might want to stop earlier instead of inspecting all the available units. Let the process terminate at the r2-th unit, where r1
3254 The Effect of Initial Sample Size and Increment in Simulation Samples on a Sequential Selection Approach
Authors: Mohammad H. Almomani
Abstract:
In this paper, we examine the effect of the initial sample size and the increment in simulation samples on the performance of a sequential approach used to select the top m designs when the number of alternative designs is very large. The sequential approach consists of two stages. In the first stage, ordinal optimization is used to select a subset that overlaps with the set of actual best k% designs with high probability. Then, in the second stage, optimal computing budget allocation is used to select the top m designs from the selected subset. We apply the selection approach to a generic example under several parameter settings, with different choices of initial sample size and increment in simulation samples, to explore the impacts on the performance of this approach. The results show that the choice of initial sample size and the increment in simulation samples does affect the performance of the selection approach.
Keywords: large-scale problems, optimal computing budget allocation, ordinal optimization, simulation optimization
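The two-stage procedure can be sketched roughly as follows. This is a simplified stand-in, not the authors' code: the screening rule (keep the best half after n0 replications of each design) and the stopping rule are illustrative assumptions, and `delta` plays the role of the increment in simulation samples that the abstract studies.

```python
import random
import statistics

def simulate(design_mean, noise=1.0, rng=random):
    """One noisy performance observation of a design (smaller is better)."""
    return design_mean + rng.gauss(0.0, noise)

def two_stage_select(design_means, m, n0, delta, budget, seed=0):
    """Stage 1: screen every design with n0 replications and keep a subset
    likely to contain the best designs.  Stage 2: spend the remaining
    budget on the subset in increments of `delta` replications per design,
    then return the observed top-m designs."""
    rng = random.Random(seed)
    samples = {i: [simulate(mu, rng=rng) for _ in range(n0)]
               for i, mu in enumerate(design_means)}
    # screening: keep the best half (at least m designs)
    subset = sorted(samples, key=lambda i: statistics.mean(samples[i]))
    subset = subset[:max(m, len(design_means) // 2)]
    spent = n0 * len(design_means)
    while spent + delta * len(subset) <= budget:
        for i in subset:
            samples[i].extend(simulate(design_means[i], rng=rng)
                              for _ in range(delta))
        spent += delta * len(subset)
    return sorted(subset, key=lambda i: statistics.mean(samples[i]))[:m]
```

Varying `n0` and `delta` while holding `budget` fixed is exactly the experiment the abstract describes: too small an `n0` risks screening out a good design, while too large an `n0` leaves little budget for the second stage.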
Procedia PDF Downloads 355
3253 Machine Learning Approach for Yield Prediction in Semiconductor Production
Authors: Heramb Somthankar, Anujoy Chakraborty
Abstract:
This paper presents a classification study on yield prediction in semiconductor production using machine learning approaches. A complicated semiconductor production process is generally monitored continuously via signals acquired from sensors and measurement sites. A monitoring system contains a variety of signals, all of which contain useful information, irrelevant information, and noise. With each signal considered a feature, feature selection is used to find the most relevant signals. The open-source UCI SECOM dataset provides 1567 such samples, of which 104 fail quality assurance. Feature extraction and selection are performed on the dataset, and useful signals are considered for further study. Afterward, common machine learning algorithms are employed to predict whether a sample passes or fails. The most suitable algorithm is selected for prediction based on the accuracy and loss of the ML model.
Keywords: deep learning, feature extraction, feature selection, machine learning classification algorithms, semiconductor production monitoring, signal processing, time-series analysis
Procedia PDF Downloads 109
3252 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks
Authors: Zeyad Abdelmageid, Xianbin Wang
Abstract:
Choosing the operational channel for a WLAN access point (AP) has traditionally been a static channel assignment process initiated by the user when deploying the AP, which fails to cope with the dynamic conditions of the assigned channel at the station side afterward. However, the dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed, and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms that consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the stations (STAs), due to the high overhead, can cause the eventually selected channel to no longer be optimal for operation, given the dynamic sharing nature of the unlicensed band. This has inspired us to develop a dynamic channel selection algorithm with reduced overhead through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel, causing high overhead, the AP divides the STAs into clusters and then assigns each STA in each cluster one channel to report feedback on. With a proper design of the cluster-based reporting, the AP does not lose any information about the channel conditions at the station side while reducing feedback overhead. The simulation results show equal and, at times, better performance with a fraction of the overhead.
We believe that this algorithm has great potential in designing future dynamic channel selection algorithms with low overhead.
Keywords: channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead
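The cluster-based reporting idea can be sketched as below. This is a hedged illustration, not the paper's algorithm: a greedy proximity rule stands in for DBSCAN, and the round-robin channel assignment is our assumption about how a cluster collectively covers all candidate channels with one report per station.

```python
import math

def cluster_stations(stations, radius):
    """Greedy proximity clustering (a lightweight stand-in for DBSCAN):
    a station joins the first cluster whose head is within `radius`,
    otherwise it starts a new cluster."""
    clusters = []
    for s in stations:
        for head, members in clusters:
            if math.dist(s, head) <= radius:
                members.append(s)
                break
        else:
            clusters.append((s, [s]))
    return clusters

def assign_reports(clusters, channels):
    """Each station reports on only one channel; stations in the same
    cluster are spread round-robin over the candidate channels, so the
    cluster as a whole covers every channel with one report each."""
    plan = {}
    for _, members in clusters:
        for i, s in enumerate(members):
            plan[s] = channels[i % len(channels)]
    return plan
```

Because nearby stations see similar channel conditions, one report per (cluster, channel) pair loses little information relative to one report per (station, channel) pair, which is where the overhead reduction comes from.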
Procedia PDF Downloads 118
3251 Transport Mode Selection under Lead Time Variability and Emissions Constraint
Authors: Chiranjit Das, Sanjay Jharkharia
Abstract:
This study focuses on transport mode selection under lead time variability and an emissions constraint. In order to reduce carbon emissions from transportation, organizations often face a dilemma in transport mode selection, since logistics cost and emissions reduction pull against each other. Another important aspect of the transportation decision is lead-time variability, which is rarely considered in the transport mode selection problem. Thus, in this study, we provide a comprehensive mathematical analytical model for transport mode selection under an emissions constraint. We also extend our work by analysing the effect of lead time variability on transport mode selection through a sensitivity analysis. To account for lead time variability in the model, two identically normally distributed random variables are incorporated: unit lead time variability and lead time demand variability. Therefore, this study addresses the following questions: How will transport mode selection decisions be affected by lead time variability? How will lead time variability impact total supply chain cost under carbon emissions? To accomplish these objectives, a total transportation cost function is developed that includes unit purchasing cost, unit transportation cost, emissions cost, holding cost during lead time, and a penalty cost for stock-outs due to lead time variability. A set of modes is available for transport between nodes; in this paper, we consider four transport modes: air, road, rail, and water. Transportation cost, distance, and emissions level for each transport mode are considered deterministic and static. Each mode has a different emissions level depending on the distance and product characteristics.
Emissions cost is indirectly affected by lead time variability if there is any switching from a lower-emissions transport mode to a higher-emissions transport mode in order to reduce penalty cost. We provide a numerical analysis to study the effectiveness of the mathematical model. We find that the chance of a stock-out during lead time is higher under greater variability of lead time and lead time demand. Numerical results show that the penalty cost of the air transport mode is negative, meaning the chance of a stock-out is zero, but it has higher holding and emissions costs. Therefore, the air transport mode is selected only when there is an emergency order to reduce penalty cost; otherwise, rail and road transport are the most preferred modes. Thus, this paper contributes to the literature with a novel approach to transport mode selection under emissions cost and lead time variability. This model can be extended by studying the effect of lead time variability on other strategic transportation issues such as the modal split option, the full truck load strategy, and demand consolidation.
Keywords: carbon emissions, inventory theoretic model, lead time variability, transport mode selection
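A minimal sketch of such a total cost function, under assumed functional forms (normal lead-time demand, a fixed service level z, and additive cost components), might look like this. The parameter names and the numbers used below are illustrative, not taken from the paper.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mode_cost(demand, distance, unit_cost, emission_rate, carbon_price,
              lt_mean, lt_sd, demand_sd, holding_rate, penalty, z=1.65):
    """Illustrative total cost of one transport mode: transport +
    emissions + holding during lead time + an expected stock-out penalty.
    Lead-time demand is treated as normal, with variance combining demand
    variability and lead-time variability (the two random variables the
    abstract incorporates)."""
    transport = unit_cost * distance * demand
    emissions = carbon_price * emission_rate * distance * demand
    # std. dev. of demand over the (random) lead time
    lt_demand_sd = math.sqrt(lt_mean * demand_sd ** 2 + (demand * lt_sd) ** 2)
    holding = holding_rate * demand * lt_mean
    stockout = penalty * (1.0 - norm_cdf(z)) * lt_demand_sd
    return transport + emissions + holding + stockout
```

Comparing a fast, expensive, high-emission mode ("air") with a slow, cheap, variable one ("rail") then reduces to evaluating this function per mode and taking the minimum, which mirrors the trade-off discussed above.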
Procedia PDF Downloads 434
3250 Industrial Process Mining Based on Data Pattern Modeling and Nonlinear Analysis
Authors: Hyun-Woo Cho
Abstract:
Unexpected events may occur with serious impacts on an industrial process. This work utilizes a data representation technique to model and analyze process data patterns for the purpose of diagnosis. The use of a triangular representation of process data is evaluated on a simulated process. Furthermore, the effect of using different pre-treatment techniques, based on linear or nonlinear reduced spaces, is compared. This work extracts the fault pattern in the reduced space, not in the original data space. The results show that the nonlinear-technique-based diagnosis method produces more reliable results and outperforms the linear method.
Keywords: process monitoring, data analysis, pattern modeling, fault, nonlinear techniques
Procedia PDF Downloads 387
3249 Authentication Based on Hand Movement by Low Dimensional Space Representation
Authors: Reut Lanyado, David Mendlovic
Abstract:
Most biometric methods for authentication require special equipment, and some of them are easy to fake. We propose a method for authentication based on hand movement while typing a sentence, captured with a regular camera. This technique uses the full video of the hand, which is harder to fake. In the first phase, we track the hand joints in each frame. Next, we represent a single frame for each individual using our Pose Agnostic Rotation and Movement (PARM) dimensional space. Then, we represent a full video of hand movement in a fixed low-dimensional space using Fixed Dimension Video by Interpolation Statistics (FDVIS). Finally, we identify each individual in the FDVIS representation using unsupervised clustering and supervised methods. Accuracy exceeds 96% for 80 individuals when using supervised KNN.
Keywords: authentication, feature extraction, hand recognition, security, signal processing
Procedia PDF Downloads 127
3248 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features
Authors: Bushra Zafar, Usman Qamar
Abstract:
Large sample sizes and high dimensionality undermine the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for extracting useful information from a variety of databases, providing supervised learning in the form of classification to design models that describe vital data classes, where the structure of the classifier is based on the class attribute. Classification efficiency and accuracy are often greatly influenced by noisy and undesirable features in real application data sets. The inherent nature of a data set greatly obscures its quality analysis and leaves quite few practical approaches to use. To our knowledge, we present for the first time an approach for investigating the structure and quality of datasets by providing a targeted analysis of the localization of noisy and irrelevant features. Machine learning relies on feature selection as a pre-processing step, which allows us to select a few features from the full feature set as a subset, reducing the space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of the given data sample by searching for a small set of important features which may result in good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is used, with an external classifier for discriminative feature selection. Features are selected based on their number of occurrences in the chosen chromosomes. A sample dataset is used to demonstrate the proposed idea effectively. The proposed method achieves an average accuracy of about 95% across different datasets. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
Keywords: data mining, genetic algorithm, KNN algorithms, wrapper-based feature selection
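A wrapper-based GA of the kind described can be sketched as follows. This is an illustrative toy, not the authors' heuristic: the wrapped evaluator here is leave-one-out 1-nearest-neighbour accuracy (the keywords mention KNN), and the GA operators (truncation selection, one-point crossover, bit-flip mutation, elitism) are common defaults assumed for the sketch.

```python
import random

def loo_accuracy(features, rows, labels):
    """Wrapped evaluator: leave-one-out 1-NN accuracy using only the
    selected feature columns."""
    if not features:
        return 0.0
    hits = 0
    for i, row in enumerate(rows):
        nearest = min((j for j in range(len(rows)) if j != i),
                      key=lambda j: sum((rows[j][f] - row[f]) ** 2
                                        for f in features))
        hits += labels[nearest] == labels[i]
    return hits / len(rows)

def ga_select(rows, labels, pop=16, gens=12, seed=1):
    """Wrapper-based selection: a chromosome is a 0/1 feature mask and
    its fitness is the wrapped classifier's accuracy."""
    rng = random.Random(seed)
    n = len(rows[0])
    def fitness(mask):
        return loo_accuracy([f for f in range(n) if mask[f]], rows, labels)
    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop // 2]          # truncation selection
        next_gen = [population[0][:]]            # elitism: keep the best
        while len(next_gen) < pop:
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]          # one-point crossover
            if rng.random() < 0.2:               # bit-flip mutation
                child[rng.randrange(n)] ^= 1
            next_gen.append(child)
        population = next_gen
    best = max(population, key=fitness)
    return [f for f in range(n) if best[f]]
```

On data where one column cleanly separates the classes and the rest is noise, the GA converges to masks that include the informative column, which is the "localization of noisy and irrelevant features" the abstract aims at.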
Procedia PDF Downloads 316
3247 Investigating the Glass Ceiling Phenomenon: An Empirical Study of Glass Ceiling's Effects on Selection, Promotion and Female Effectiveness
Authors: Sharjeel Saleem
Abstract:
The glass ceiling has been a burning issue for many researchers. In this research, we examine the gender composition of the board of directors (BOD), training and development, workforce diversity, positive attitudes towards women, and employee acts as antecedents of the glass ceiling. Furthermore, we examine the effects of the glass ceiling on the likelihood of female selection and promotion and on female effectiveness. Multiple linear regression conducted on data drawn from different public and private sector organizations supports our hypotheses. The research, however, is limited to Faisalabad city, and only females from the minority group are targeted here.
Keywords: glass ceiling, stereotype attitudes, female effectiveness
Procedia PDF Downloads 291
3246 Classification of Political Affiliations by Reduced Number of Features
Authors: Vesile Evrim, Aliyu Awwal
Abstract:
With the evolution of technology, the expression of opinions has shifted to the digital world. The domain of politics, one of the hottest topics of opinion mining research, merges here with behavior analysis for determining affiliation in text, which constitutes the subject of this paper. This study aims to classify text from news/blogs as either Republican or Democrat with the minimum number of features. As an initial set, 68 features, 64 of which are Linguistic Inquiry and Word Count (LIWC) features, are tested against 14 benchmark classification algorithms. In later experiments, the dimensions of the feature vector are reduced using 7 feature selection algorithms. The results show that the Decision Tree, Rule Induction, and M5 Rule classifiers, when used with the SVM and IGR feature selection algorithms, performed best, with up to 82.5% accuracy on the given dataset. Further tests on a single feature and the linguistic-based feature sets showed similar results. The feature "function", an aggregate feature of the linguistic category, is found to be the most differentiating feature among the 68 features, achieving 81% accuracy by itself in classifying articles as either Republican or Democrat.
Keywords: feature selection, LIWC, machine learning, politics
Procedia PDF Downloads 382
3245 Optimal Portfolio Selection under Treynor Ratio Using Genetic Algorithms
Authors: Imad Zeyad Ramadan
Abstract:
In this paper, a genetic algorithm (GA) was developed to construct the optimal portfolio based on the Treynor method. The GA maximizes the Treynor ratio under a budget constraint to select the best allocation of the budget across the companies in the portfolio. The results show that the GA was able to construct a conservative portfolio which includes companies from all three sectors. This indicates that the GA reduced the risk to the investor, as it chose some companies with positive risk (moving with the market) and some with negative risk (moving against the market).
Keywords: optimization, genetic algorithm, portfolio selection, Treynor method
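The objective can be sketched as below, under the assumption that the budget constraint means portfolio weights summing to one; the GA operators and parameters here are illustrative defaults, not the paper's.

```python
import random

def treynor_ratio(weights, returns, betas, risk_free=0.03):
    """Treynor ratio: portfolio excess return per unit of systematic
    (beta) risk."""
    rp = sum(w * r for w, r in zip(weights, returns))
    bp = sum(w * b for w, b in zip(weights, betas))
    return (rp - risk_free) / bp

def ga_portfolio(returns, betas, pop=30, gens=40, seed=7):
    """Toy GA for the budget-constrained problem: a chromosome is a
    weight vector renormalised to sum to 1 (the budget); the top half
    survives each generation and offspring are blends of two parents
    plus Gaussian mutation."""
    rng = random.Random(seed)
    n = len(returns)
    def normalise(w):
        s = sum(w)
        return [x / s for x in w]
    def fitness(w):
        return treynor_ratio(w, returns, betas)
    population = [normalise([rng.random() for _ in range(n)])
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop // 2]
        population = survivors[:]
        while len(population) < pop:
            a, b = rng.sample(survivors, 2)
            child = [max(1e-9, (x + y) / 2 + rng.gauss(0.0, 0.05))
                     for x, y in zip(a, b)]
            population.append(normalise(child))
    return max(population, key=fitness)
```

Note the sketch assumes positive betas so the ratio's denominator stays positive; handling negative-beta companies, as the paper's portfolio does, needs extra care.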
Procedia PDF Downloads 449
3244 Subject, Language, and Representation: Snyder's Poetics of Emptiness
Authors: Son Hyesook
Abstract:
This project explores the possibility of a poetics of emptiness in the poetry of Gary Snyder, one of the most experimental American poets, interpreting his works as an expression of the Buddhist concept of emptiness. This philosophical term denotes the lack of intrinsic nature in all phenomena and the absence of an independent, perduring self. Snyder’s poetics of emptiness locates the extralinguistic reality, emptiness, within the contingent nexus of language itself instead of transcending or discarding it. Language, therefore, plays an important role in his poetry, a medium intentionally applied to carrying out this Buddhist telos. Snyder’s poetry is characterized by a strangeness and disruptiveness of language, as is often the case with Asian Zen discourses. The elision of the lyric ‘I’ and of transitive verbs, for example, is his grammatical attempt to represent the illusory nature of the self. He replaces the solitary speaker with sparely modified, concrete but generic images to prevent any anthropocentric understanding of the world and to demonstrate human enactment in a harmonious interplay with other elements of life as part of a vast web of interconnections, where everything is interrelated with every other thing. In many of his poems, Snyder employs grammatical and structural ellipses and paratactical construction to avoid a facile discursive relation and to help the reader illogically imagine the inexpressible, the void. Through various uses of typographical and semantic space, his poetry forces the reader to experience the ‘thought-pause’ and intuitively perceive things-as-they-are. Snyder enacts in his poetics an alternative to postmodern perspectives on the subject, language, and representation, and revitalizes their skeptical view of any account of human agency and the possibility of language.
Keywords: subject, language, representation, poetics of emptiness
Procedia PDF Downloads 197
3243 Frequent Itemset Mining Using Rough-Sets
Authors: Usman Qamar, Younus Javed
Abstract:
Frequent pattern mining is the process of finding a pattern (a set of items, subsequences, substructures, etc.) that occurs frequently in a data set. It was proposed in the context of frequent itemsets and association rule mining. Frequent pattern mining is used to find inherent regularities in data, e.g., which products were often purchased together. Its applications include basket data analysis, cross-marketing, catalog design, sale campaign analysis, Web log (click stream) analysis, and DNA sequence analysis. However, one of the bottlenecks of frequent itemset mining is that as the data grow, the amount of time and resources required to mine them increases at an exponential rate. In this investigation, a new algorithm is proposed which can be used as a pre-processor for frequent itemset mining. FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid pre-processor algorithm which utilizes entropy and rough sets to carry out record reduction and feature (attribute) selection, respectively. As a pre-processor for frequent itemset mining, FASTER can produce a speed-up of 3.1 times compared to the original algorithm while maintaining an accuracy of 71%.
Keywords: rough sets, classification, feature selection, entropy, outliers, frequent itemset mining
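The two FASTER ingredients, entropy-based attribute selection and rough-set style record reduction, can be sketched as follows. This is our reading of the abstract, not the published algorithm; in particular the "keep the highest-entropy attributes" rule and the indiscernibility-based deduplication are assumptions.

```python
import math
from collections import Counter

def entropy(column):
    """Shannon entropy of a discrete attribute column."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def select_features(rows, keep):
    """Entropy step: rank attribute columns by entropy and keep the
    `keep` most informative ones."""
    n_attrs = len(rows[0])
    ranked = sorted(range(n_attrs),
                    key=lambda f: entropy([r[f] for r in rows]),
                    reverse=True)
    return sorted(ranked[:keep])

def reduce_records(rows, features):
    """Rough-set step: drop records that are indiscernible from an
    earlier record on the selected attributes."""
    seen, kept = set(), []
    for r in rows:
        key = tuple(r[f] for f in features)
        if key not in seen:
            seen.add(key)
            kept.append(r)
    return kept
```

Shrinking both dimensions of the data before mining is what yields the reported speed-up, at the cost of some accuracy.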
Procedia PDF Downloads 437
3242 Representation of the Iranian Community in the Videos of the Instagram Page of the World Health Organization Representative in Iran
Authors: Naeemeh Silvari
Abstract:
The spread of the coronavirus epidemic posed various challenges to many aspects of social life around the world. In this regard, and in order to improve people's living conditions, the World Health Organization has sought to publish the necessary instructions for its audiences worldwide through its media capacities. Given the importance of cultural differences in health communication and the distinct needs of people in different societies, some content was produced and published exclusively for particular regions. This research studies, as a case study, six videos published on the official Instagram page of the World Health Organization representative in Iran. The published content has little semantic affinity with Iranian culture and instead attempts to show a uniform image of the Middle East, dominated by the image of the culture of the developing Arab countries.
Keywords: corona, representation, semiotics, Instagram, health communication
Procedia PDF Downloads 93
3241 Two Stage Fuzzy Methodology to Evaluate the Credit Risks of Investment Projects
Authors: O. Badagadze, G. Sirbiladze, I. Khutsishvili
Abstract:
The work proposes a decision support methodology for credit risk minimization in the selection of investment projects. The methodology provides two stages of project evaluation. A preliminary selection of projects with minor credit risks is made using the expertons method. The second stage ranks the chosen projects using the possibilistic discrimination analysis method, a new modification of the well-known method of fuzzy discrimination analysis.
Keywords: expert valuations, expertons, investment project risks, positive and negative discriminations, possibility distribution
Procedia PDF Downloads 676
3240 Female Tenderness in Children’s Literature: A Content Analysis of Gender Depiction in Greek Preschool Picture Books
Authors: Theopoula Karanikolaou
Abstract:
In recent decades, an increasing number of studies have indicated the negative impact of gender stereotypes on various aspects of society as well as on everyday life. At the same time, children’s literature is considered an important factor in gender-role socialization, as it provides young readers with socially accepted gender behavioral models. Using a content analysis approach, this research examines female representations in Greek children’s literature published from 2009 to 2019. Results indicate that female characters are depicted as sensitive and tender both in texts and in illustrations, traits that are almost absent in the male characters of the sample. Highlighting the emotional side of female characters, in contrast with a restrained male attitude, reproduces gender biases. Stereotypical gender representation in children’s literature cultivates further discrimination between men and women.
Keywords: children's literature, female representation, gender socialization, gender studies
Procedia PDF Downloads 89
3239 Integration of Knowledge and Metadata for Complex Data Warehouses and Big Data
Authors: Jean Christian Ralaivao, Fabrice Razafindraibe, Hasina Rakotonirainy
Abstract:
This document continues work carried out in the field of complex data warehouses (DW) on the management and formalization of knowledge and metadata. It offers a methodological approach for integrating the two concepts, knowledge and metadata, within the framework of a complex DW architecture. The work considers the use of knowledge representation by description logics and the extension of the Common Warehouse Metamodel (CWM) specifications, which should have a positive impact on the performance of a complex DW. Three essential aspects of this work are expected: the representation of knowledge in description logics, the translation of this knowledge into consistent UML diagrams while respecting or extending the CWM specifications, and the use of XML as a pivot format. The field of application is large but will be adapted to systems with heterogeneous, complex and unstructured content that, moreover, require extensive (re)use of knowledge, such as medical data warehouses.
Keywords: data warehouse, description logics, integration, knowledge, metadata
Procedia PDF Downloads 138
3238 Solution of Logistics Center Selection Problem Using the Axiomatic Design Method
Authors: Fulya Zaralı, Harun Resit Yazgan
Abstract:
Logistics centers are areas in which various businesses can carry out all national and international logistics activities and related operations. Logistics centers are of key importance in joining the transport stream and the operations of the transport system. It is therefore important that these centers be positioned so as to be effective and efficient and to deliver their expected performance. In this study, the location selection problem of positioning a logistics center is discussed. Alternative centers are evaluated according to certain criteria, and the most appropriate center is identified using the axiomatic design method.
Keywords: axiomatic design, logistic center, facility location, information systems
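The selection step in axiomatic design typically ranks alternatives by Suh's information axiom, preferring the candidate with the least total information content I = Σ log₂(1/pᵢ), where pᵢ is the probability that the alternative satisfies functional requirement i. A minimal sketch, with hypothetical sites and probabilities:

```python
import math

def information_content(success_probs):
    # Information axiom: I = sum of log2(1/p_i) over the
    # functional requirements; lower is better.
    return sum(math.log2(1.0 / p) for p in success_probs)

# hypothetical probabilities that each candidate logistics center
# satisfies each of three functional requirements
candidates = {
    "site_A": [0.9, 0.8, 0.7],
    "site_B": [0.95, 0.6, 0.9],
    "site_C": [0.7, 0.7, 0.7],
}
best = min(candidates, key=lambda s: information_content(candidates[s]))
```

The probabilities would in practice come from the overlap of each criterion's design range with the site's system range; the criteria themselves are whatever the study's evaluation scheme defines.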
Procedia PDF Downloads 348
3237 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index
Authors: Todd Zhou, Mikhail Yurochkin
Abstract:
Out-of-distribution (OOD) detection is receiving increasing attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This newly burgeoning field calls for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. Exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, "CombinedScore", as the evaluation criterion. This proposed score was created by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias was explored through the equality-of-opportunity violation measurement, and model performance was further improved through calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.
Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index
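The abstract does not give the exact formula for "CombinedScore"; one plausible reading — a harmonic mean of labeled-source accuracy and a confidence score derived from OOD predictive entropy — can be sketched as follows. The function names and the precise combination are assumptions, not the authors' definition.

```python
import numpy as np

def predictive_entropy(probs):
    # Mean Shannon entropy of the predicted class distributions
    # (rows of `probs` are per-example probability vectors).
    p = np.clip(probs, 1e-12, 1.0)
    return float(np.mean(-np.sum(p * np.log(p), axis=1)))

def combined_score(source_accuracy, probs):
    # Hypothetical harmonic-mean combination of labeled-source accuracy
    # and OOD confidence, where confidence = 1 - normalized entropy.
    confidence = 1.0 - predictive_entropy(probs) / np.log(probs.shape[1])
    if source_accuracy + confidence == 0.0:
        return 0.0
    return 2 * source_accuracy * confidence / (source_accuracy + confidence)
```

A model that is confident on OOD inputs but inaccurate on the source (or vice versa) is penalized, since the harmonic mean is dominated by the smaller of the two terms.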
Procedia PDF Downloads 124
3236 Fusion of Shape and Texture for Unconstrained Periocular Authentication
Authors: D. R. Ambika, K. R. Radhika, D. Seshachalam
Abstract:
Unconstrained authentication is an important component of personal automated systems and human-computer interfaces. Existing solutions mostly use the face as the primary object of analysis. The performance of face-based systems is largely determined by the extent of deformation in the facial region and the amount of useful information available in occluded face images. The periocular region is a useful portion of the face, combining discriminative ability with resistance to deformation, and a reliable portion of the periocular area remains available in occluded images. The present work demonstrates that a joint representation of periocular texture and periocular structure provides an effective expression- and pose-invariant representation. The proposed methodology provides an effective and compact description of periocular texture and shape. The method is tested on four benchmark datasets exhibiting varied acquisition conditions.
Keywords: periocular authentication, Zernike moments, LBP variance, shape and texture fusion
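One ingredient named in the keywords, the LBP/variance texture descriptor, can be sketched in plain NumPy. This is an illustration of the basic 8-neighbour operators, not the authors' pipeline (which also fuses Zernike-moment shape features).

```python
import numpy as np

def lbp_codes(img):
    # 8-neighbour local binary pattern for interior pixels:
    # each neighbour >= centre contributes one bit of the code.
    h, w = img.shape
    centre = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(centre.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= centre).astype(np.uint8) << np.uint8(bit)
    return codes

def lbp_variance(img):
    # Local variance of the 8 neighbours -- the contrast measure
    # commonly paired with LBP codes.
    h, w = img.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    stack = np.stack([img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
                      for dy, dx in offsets])
    return stack.var(axis=0)
```

Histograms of the codes, weighted or accompanied by the local variance, then serve as the texture half of a texture-plus-shape fusion.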
Procedia PDF Downloads 278
3235 Weighted Rank Regression with Adaptive Penalty Function
Authors: Kang-Mo Jung
Abstract:
The use of regularization in statistical methods has become popular. The least absolute shrinkage and selection operator (LASSO) framework has become the standard tool for sparse regression. However, it is well known that the LASSO is sensitive to outliers and leverage points. We consider a new robust estimator composed of a weighted loss function on the pairwise differences of residuals and an adaptive penalty function that regulates the tuning parameter for each variable. Rank regression is resistant to regression outliers, but not to leverage points; by adopting a weighted loss function, the proposed method becomes robust to leverage points in the predictor variables. Furthermore, the adaptive penalty function yields good statistical properties in variable selection, such as the oracle property and consistency. We develop an efficient algorithm to compute the proposed estimator using basic functions of the R language, with an optimal tuning parameter chosen by the Bayesian information criterion (BIC). Numerical simulation shows that the proposed estimator is effective for analyzing both real and contaminated data sets.
Keywords: adaptive penalty function, robust penalized regression, variable selection, weighted rank regression
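The core of the loss — weighted absolute pairwise differences of residuals, with observation weights guarding against leverage points — can be sketched as below. The leverage-based weighting scheme and the crude grid search are illustrative assumptions; the abstract's adaptive penalty term and BIC tuning are omitted.

```python
import numpy as np

def weighted_rank_loss(beta, X, y, w):
    # Weighted Wilcoxon-type loss: sum over pairs (i, j) of
    # w_i * w_j * |r_i - r_j|, where r = y - X @ beta.
    r = y - X @ beta
    return 0.5 * np.sum(np.outer(w, w) * np.abs(np.subtract.outer(r, r)))

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=30)
X[0] = [10.0, 10.0]                      # plant a leverage point
y[0] = -50.0                             # with a grossly wrong response
w = 1.0 / (1.0 + np.sum(X**2, axis=1))   # simple leverage-based down-weighting

# coarse grid search stands in for a proper optimizer
grid = [np.array([b1, b2])
        for b1 in np.linspace(0.0, 4.0, 81)
        for b2 in np.linspace(-3.0, 1.0, 81)]
beta_hat = min(grid, key=lambda b: weighted_rank_loss(b, X, y, w))
```

Because the leverage observation carries a tiny weight, its gross error barely tilts the fit, and the estimate stays near the true coefficients (2, -1).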
Procedia PDF Downloads 474
3234 Firm Level Productivity Heterogeneity and Export Behavior: Evidence from UK
Authors: Umut Erksan Senalp
Abstract:
The aim of this study is to examine the link between firm-level productivity heterogeneity and a firm’s decision to export. We thus test the self-selection hypothesis, which suggests that only more productive firms self-select into export markets. We analyze the UK manufacturing sector using firm-level data for the period 2003-2011. Although our preliminary results suggest that exporters outperform non-exporters when we pool all manufacturing industries, when we examine each industry individually we find that the self-selection hypothesis does not hold in every industry.
Keywords: total factor productivity, firm heterogeneity, international trade, decision to export
Procedia PDF Downloads 361
3233 Site Selection of CNG Station by Using FUZZY-AHP Model (Case Study: Gas Zone 4, Tehran City Iran)
Authors: Hamidrza Joodaki
Abstract:
The most complex issue in urban land-use planning is site selection, which requires the assessment of a variety of elements and factors. Multi-Criteria Decision Making (MCDM) methods are the best approach for dealing with such complex problems. In this paper, a combination of the analytic hierarchy process (AHP) model and fuzzy logic was used as the MCDM method to select the best site for a gas station in the 4th gas zone of Tehran. The first and most important step in the FUZZY-AHP model is the selection of criteria and sub-criteria. Population, accessibility, proximity and natural disasters were considered as the main criteria in this study. After being chosen, the criteria were weighted based on AHP using the Expert Choice software, and fuzzy logic was used to enhance accuracy and better approach reality. Criteria layers were then produced and weighted based on the FUZZY-AHP model in GIS. Finally, the layers were integrated in the ArcGIS software and the best site to locate a gas station in the 4th gas zone of Tehran was selected.
Keywords: multiple criteria decision making (MCDM), analytic hierarchy process (AHP), FUZZY logic, geographic information system (GIS)
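The AHP weighting step can be sketched as follows: a pairwise comparison matrix over the four main criteria is reduced to a priority vector (here via the standard row-geometric-mean approximation to the principal eigenvector), and a consistency ratio checks the judgements. The pairwise judgements below are hypothetical, not the study's actual values.

```python
import numpy as np

def ahp_weights(M):
    # Priority vector: normalized row geometric means, the usual
    # approximation to the principal eigenvector of the pairwise matrix.
    g = np.prod(M, axis=1) ** (1.0 / M.shape[0])
    return g / g.sum()

def consistency_ratio(M, w, random_index=0.90):
    # Saaty consistency ratio; random_index 0.90 is the tabulated
    # value for a 4x4 matrix. CR < 0.1 is the usual acceptance rule.
    n = M.shape[0]
    lam = np.mean((M @ w) / w)          # estimate of the principal eigenvalue
    ci = (lam - n) / (n - 1)
    return ci / random_index

# hypothetical pairwise judgements over the four main criteria:
# population, accessibility, proximity, natural disasters
M = np.array([
    [1.0,     3.0,     5.0,     7.0],
    [1 / 3.0, 1.0,     3.0,     5.0],
    [1 / 5.0, 1 / 3.0, 1.0,     3.0],
    [1 / 7.0, 1 / 5.0, 1 / 3.0, 1.0],
])
w = ahp_weights(M)
cr = consistency_ratio(M, w)
```

In a FUZZY-AHP variant, the crisp judgements in M would be replaced by triangular fuzzy numbers before the weights are derived; the resulting weights then drive the GIS overlay of the criteria layers.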
Procedia PDF Downloads 361
3232 Arithmetic Operations Based on Double Base Number Systems
Authors: K. Sanjayani, C. Saraswathy, S. Sreenivasan, S. Sudhahar, D. Suganya, K. S. Neelukumari, N. Vijayarangan
Abstract:
The Double Base Number System (DBNS) is an emerging system for representing a number using two bases, namely 2 and 3, with applications in Elliptic Curve Cryptography (ECC) and the Digital Signature Algorithm (DSA). The previous binary representation method used only base 2. DBNS uses an approximation algorithm, the greedy algorithm, under which the number of digits required to represent a large number is smaller than with the standard base-2 binary method; hence the computational speed is increased and the running time reduced. The standard binary method uses the binary digits 0 and 1 to represent a number, whereas the DBNS method uses the binary digit 1 alone to represent any number (canonical form). The greedy algorithm can represent a number in two ways: using only positive summands, or using both positive and negative summands. In this paper, these arithmetic operations are used for elliptic curve cryptography. The elliptic curve discrete logarithm problem is the foundation of most day-to-day elliptic curve cryptography, and it appears to be a considerably harder problem than the ordinary discrete logarithm problem. In the elliptic curve digital signature algorithm, key generation requires 160 bits of data under the standard binary representation, whereas the number of bits required to generate the key can be reduced with the help of the double base number representation. In this paper, a new technique is proposed to generate the key during encryption and to extract the key during decryption.
Keywords: cryptography, double base number system, elliptic curve cryptography, elliptic curve digital signature algorithm
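The greedy DBNS expansion with positive summands can be sketched as follows (an illustration of the textbook algorithm, not the authors' implementation): at each step, subtract the largest number of the form 2^a · 3^b that does not exceed the remainder.

```python
def largest_2_3_integer(n):
    # Largest number of the form 2**a * 3**b not exceeding n (n >= 1):
    # for each power of 3 up to n, multiply by 2 as far as possible.
    best = 1
    p3 = 1
    while p3 <= n:
        p2 = p3
        while p2 * 2 <= n:
            p2 *= 2
        best = max(best, p2)
        p3 *= 3
    return best

def dbns_greedy(n):
    # Greedy DBNS expansion using positive summands only.
    terms = []
    while n > 0:
        t = largest_2_3_integer(n)
        terms.append(t)
        n -= t
    return terms
```

For example, `dbns_greedy(1000)` yields [972, 27, 1] — three summands (972 = 2²·3⁵, 27 = 3³), versus six one-bits in the binary expansion of 1000 — which is what shortens the addition chains used in ECC scalar multiplication.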
Procedia PDF Downloads 396