Search results for: pedestrian target selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4926

4806 Evaluation and Selection of Construction Contractors by Polish Public Clients

Authors: Kozik Renata, Leśniak Agnieszka, Plebankiewicz Edyta

Abstract:

Contracting authorities in the public sector are obligated to apply the principles provided for in Polish law for the evaluation and selection of contractors. To analyze the contractor selection methods applied in practice by public clients, the notices of contract award results for construction works were analyzed. The analysis shows that the procedures selected more and more often are open competitive bidding, where the assessment of contractors' competence is not very precise, and non-competitive bidding, i.e. single-source procurement. The share of procurement procedures where the only criterion is price is increasing. A solution to these problems might be the introduction of one of the forms of pre-selection of contractors. The article also briefly discusses verification systems for companies applying for public contracts used in EU countries.

Keywords: certification, contractors selection, open tendering, public investors

Procedia PDF Downloads 257
4805 Manufacturing Facility Location Selection: A Numerical Taxonomy Approach

Authors: Seifoddini Hamid, Mardikoraeem Mahsa, Ghorayshi Roya

Abstract:

Manufacturing facility location selection is an important strategic decision for many industrial corporations. In this paper, a new approach to the manufacturing location selection problem is proposed. In this approach, cluster analysis is employed to identify suitable manufacturing locations based on economic, social, environmental, and political factors. These factors are quantified using existing real-world data.

Keywords: manufacturing facility, manufacturing sites, real world data

Procedia PDF Downloads 537
4804 Proposal of a Model Supporting Decision-Making on Information Security Risk Treatment

Authors: Ritsuko Kawasaki, Takeshi Hiromatsu

Abstract:

Management is required to understand all information security risks within an organization and to decide which risks should be treated, to what level, and at what cost. However, such decision-making is not usually easy, because various measures for risk treatment must be selected at suitable application levels. In addition, some measures may have objectives that conflict with each other, which also makes the selection difficult. Therefore, this paper provides a model which supports the selection of measures by applying multi-objective analysis to find an optimal solution. Additionally, a list of measures is provided to make the selection easier and more effective without omitting any measures.
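
As a minimal, generic illustration of the multi-objective idea described above (not the authors' model), the sketch below filters a set of hypothetical security measures down to the Pareto-optimal subset when trading off implementation cost against risk reduction; the measure names and values are invented for illustration.

```python
# Minimal Pareto-front filter for security measures (illustrative only).
# Each measure has a cost (lower is better) and a risk reduction (higher is better).

measures = {
    # name: (cost, risk_reduction) -- hypothetical values
    "access control":  (30, 0.50),
    "disk encryption": (20, 0.35),
    "staff training":  (15, 0.30),
    "24/7 monitoring": (60, 0.55),
    "full redundancy": (90, 0.50),
}

def dominates(a, b):
    """True if measure a is no worse than b on both objectives and better on at least one."""
    (ca, ra), (cb, rb) = a, b
    return ca <= cb and ra >= rb and (ca < cb or ra > rb)

def pareto_front(candidates):
    """Keep only measures not dominated by any other measure."""
    return {name: obj for name, obj in candidates.items()
            if not any(dominates(other, obj)
                       for other_name, other in candidates.items()
                       if other_name != name)}

if __name__ == "__main__":
    for name, (cost, reduction) in pareto_front(measures).items():
        print(f"{name}: cost={cost}, risk reduction={reduction}")
```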

Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization

Procedia PDF Downloads 348
4803 Methodology for the Selection of Chemical Textile Products

Authors: Oscar F. Toro, Alexia Pardo Figueroa, Brigitte M. Larico

Abstract:

The development of new processes in the textile industry entails designing methodologies to select supplies that fit the requirements of these new processes. This paper presents a methodology to select chemicals that fulfill the technical specifications of a new process. The proposed methodology involves three major phases: (1) data collection on chemical products, (2) qualitative pre-selection and (3) laboratory tests. We have applied this methodology to the selection of a binder which will form a protective film above the textile fibers and bond them. We found that there are five products that can be used in our new process: Arkofil, Elvanol, Size plus A, Size plus AC and Starch. This new methodology has both qualitative and experimental variables, and can be used to select supplies for new textile processes.

Keywords: binder, chemical products, selection methodology, textile supplies, textile fiber

Procedia PDF Downloads 264
4802 Positive Bias and Length Bias in Deep Neural Networks for Premises Selection

Authors: Jiaqi Huang, Yuheng Wang

Abstract:

Premises selection, the task of selecting a set of axioms for proving a given conjecture, is a major bottleneck in automated theorem proving. An array of deep-learning-based methods has been established for premises selection, but perfect performance remains challenging. Our study examines the inaccuracy of deep neural networks in premises selection. Through training network models using encoded conjecture and axiom pairs from the Mizar Mathematical Library, two potential biases are found: the network models classify more premises as necessary than unnecessary, referred to as the ‘positive bias’, and the network models perform better in proving conjectures that are paired with more axioms, referred to as the ‘length bias’. The ‘positive bias’ and ‘length bias’ discovered could point to limitations of existing deep neural networks.

Keywords: automated theorem proving, premises selection, deep learning, interpreting deep learning

Procedia PDF Downloads 139
4801 A Survey of Feature Selection and Feature Extraction Techniques in Machine Learning

Authors: Samina Khalid, Shamila Nasreen

Abstract:

Dimensionality reduction as a preprocessing step to machine learning is effective in removing irrelevant and redundant data, increasing learning accuracy, and improving result comprehensibility. However, the recent increase in the dimensionality of data poses a severe challenge to many existing feature selection and feature extraction methods with respect to efficiency and effectiveness. In the field of machine learning and pattern recognition, dimensionality reduction is an important area in which many approaches have been proposed. In this paper, some widely used feature selection and feature extraction techniques are analyzed to determine how effectively they can be used to achieve high performance of learning algorithms and ultimately improve the predictive accuracy of classifiers. The strengths and weaknesses of some widely used dimensionality reduction methods are briefly investigated.
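
As a small illustration of the two technique families surveyed (not code from the paper), the sketch below contrasts a filter-style feature selection method (mutual-information ranking) with a feature extraction method (PCA) on a standard scikit-learn dataset, using the same downstream classifier.

```python
# Feature selection vs. feature extraction (illustrative sketch).
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
clf = KNeighborsClassifier(n_neighbors=5)

# Feature selection: keep the 10 original features ranked highest by mutual information.
selection = make_pipeline(StandardScaler(),
                          SelectKBest(mutual_info_classif, k=10), clf)

# Feature extraction: project all features onto 10 principal components.
extraction = make_pipeline(StandardScaler(), PCA(n_components=10), clf)

for name, pipe in [("selection (SelectKBest)", selection),
                   ("extraction (PCA)", extraction)]:
    score = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```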

Keywords: age related macular degeneration, feature selection, feature subset selection, feature extraction/transformation, FSAs, Relief, correlation based method, PCA, ICA

Procedia PDF Downloads 453
4800 Students' Ability Analysis Methods, Devices, Electronic Equipment and Storage Media Design

Authors: Dequn Teng, Tianshuo Yang, Mingrui Wang, Qiuyu Chen, Xiao Wang, Katie Atkinson

Abstract:

Currently, many students are somewhat at a loss in the university due to the complex environment within the campus, where information is isolated in separate systems with few interactions between them. However, if on-campus resources are gathered and combined with artificial intelligence modelling techniques, there will be a bridge not only for students to understand themselves, but also for teachers to understand students, providing a much more efficient approach to education. The objective of this paper is to provide a competency level analysis method, apparatus, electronic equipment, and storage medium. A target competency level analysis model is selected from a plurality of predefined candidate competency level analysis models by obtaining a user's promotion target parameters, which include at least one of the following: target profession, target industry, and target company. According to these parameters, the model analyzes and determines the user's ability level, realizes a quantitative and personalized analysis of the user's ability level, and helps the user to objectively position his or her ability level.

Keywords: artificial intelligence, model, university, education, recommendation system, evaluation, job hunting

Procedia PDF Downloads 117
4799 Two-Sided Information Dissemination in Takeovers: Disclosure and Media

Authors: Eda Orhun

Abstract:

Purpose: This paper analyzes a target firm’s decision to voluntarily disclose information during a takeover event and the effect of such disclosures on the outcome of the takeover. Such voluntary disclosures, especially in the form of earnings forecasts made around takeover events, may affect shareholders’ decisions about the target firm’s value and, in turn, the takeover result. This study aims to shed light on this question. Design/methodology/approach: The paper examines the role of voluntary disclosures by target firms during a takeover event in the likelihood of takeover success, both theoretically and empirically. A game-theoretical model is set up to analyze the voluntary disclosure decision of a target firm to inform the shareholders about its real worth. The empirical implication of the model is tested by employing binary outcome models, where the disclosure variable is obtained by identifying the target firms in the sample that provide positive news by issuing increasing management earnings forecasts. Findings: The model predicts that a voluntary disclosure of positive information by the target decreases the likelihood that the takeover succeeds. The empirical analysis confirms this prediction by showing that positive earnings forecasts by target firms during takeover events increase the probability of takeover failure. Overall, it is shown that information dissemination through voluntary disclosures by target firms is an important factor affecting takeover outcomes. Originality/Value: This study is, to the author's knowledge, the first to examine the impact of voluntary disclosures by the target firm during a takeover event on the likelihood of takeover success. The results contribute to the information economics, corporate finance and M&A literatures.

Keywords: takeovers, target firm, voluntary disclosures, earnings forecasts, takeover success

Procedia PDF Downloads 290
4798 Partner Selection for Innovation Projects Related to New Product Concept Design

Authors: Odd Jarl Borch, Marina Z. Solesvik

Abstract:

The paper analyses partner selection approaches related to large-scale R&D-based innovation projects at different stages of development. We emphasize innovation projects in the maritime value chain and how partners are selected to improve quality according to high-spec customer demands and to reduce investment costs in new production technology such as advanced offshore service vessels. We elaborate on the differences in innovation approach and especially on the role that purposive inflows and outflows of knowledge from external partners may play in accelerating internal innovation. We present three cases related to different projects in terms of specificity and scope. We explore how the partner selection criteria change over time as the goals move from a wide scope to very specific R&D tasks.

Keywords: partner selection, innovation, offshore industry, concept design

Procedia PDF Downloads 488
4797 Multi-Criteria Test Case Selection Using Ant Colony Optimization

Authors: Niranjana Devi N.

Abstract:

Test case selection means selecting the subset of only the fit test cases and removing the unfit, ambiguous, redundant and unnecessary ones, which in turn improves the quality and reduces the cost of software testing. Test case optimization is the problem of finding the best subset of test cases from a pool of test cases to be audited, so that all testing objectives are met concurrently. However, most research has evaluated the fitness of test cases on only a single parameter, fault-detecting capability, and has optimized the test cases using a single objective. In the proposed approach, nine parameters are considered for test case selection, and the best subset of parameters for test case selection is obtained using an interval type-2 fuzzy rough set. Test case selection is done in two stages. The first stage is a fuzzy entropy-based filtration technique, used for estimating and reducing the ambiguity in test case fitness evaluation and selection. The second stage is an ant colony optimization-based wrapper technique with a forward search strategy, employed to select test cases from the reduced test suite of the first stage. The results are evaluated using coverage parameters, precision, recall, F-measure, APSC, APDC, and SSR. The experimental evaluation demonstrates that considerable computational effort can be avoided by this approach.
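
A minimal sketch of the first-stage idea only, fuzzy entropy-based filtration: each test case's fuzzy membership values over a few fitness criteria are scored with a De Luca-Termini style entropy, and highly ambiguous cases are filtered out before the ACO wrapper stage. The criteria, membership values and threshold below are assumptions, not the paper's actual nine parameters.

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini style fuzzy entropy of membership degrees in [0, 1].
    0 when every membership is crisp (0 or 1); maximal when all are 0.5."""
    h = 0.0
    for mu in memberships:
        for p in (mu, 1.0 - mu):
            if p > 0.0:
                h -= p * math.log2(p)
    return h / len(memberships)  # normalise by the number of criteria

# Hypothetical fitness memberships of test cases over a few criteria
# (e.g. coverage, fault-detection history, execution cost), all in [0, 1].
test_cases = {
    "tc01": [0.95, 0.88, 0.10],   # clearly fit / unfit on each criterion
    "tc02": [0.52, 0.49, 0.55],   # ambiguous on every criterion
    "tc03": [0.80, 0.20, 0.90],
}

THRESHOLD = 0.9   # assumed cut-off: drop the most ambiguous test cases
kept = {tc: m for tc, m in test_cases.items() if fuzzy_entropy(m) < THRESHOLD}
print("retained for the ACO wrapper stage:", sorted(kept))
```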

Keywords: ant colony optimization, fuzzy entropy, interval type-2 fuzzy rough set, test case selection

Procedia PDF Downloads 630
4796 An Integrated DEMATEL-QFD Model for Medical Supplier Selection

Authors: Mehtap Dursun, Zeynep Şener

Abstract:

Supplier selection is considered as one of the most critical issues encountered by operations and purchasing managers to sharpen the company’s competitive advantage. In this paper, a novel fuzzy multi-criteria group decision making approach integrating quality function deployment (QFD) and decision making trial and evaluation laboratory (DEMATEL) method is proposed for supplier selection. The proposed methodology enables to consider the impacts of inner dependence among supplier assessment criteria. A house of quality (HOQ) which translates purchased product features into supplier assessment criteria is built using the weights obtained by DEMATEL approach to determine the desired levels of supplier assessment criteria. Supplier alternatives are ranked by a distance-based method.
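
A compact numerical sketch of the DEMATEL step that yields the criteria weights fed into the HOQ; the direct-influence matrix below is invented for illustration, and the paper's fuzzy extension and actual criteria are not reproduced.

```python
import numpy as np

# Hypothetical direct-influence matrix among four supplier assessment criteria
# (rows influence columns); 0-4 scale, zero diagonal.
A = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [2, 1, 0, 3],
              [1, 2, 1, 0]], dtype=float)

# Normalise by the largest row sum, then compute the total-relation matrix
# T = N (I - N)^(-1), which accumulates direct and indirect influences.
N = A / A.sum(axis=1).max()
T = N @ np.linalg.inv(np.eye(len(A)) - N)

R = T.sum(axis=1)          # influence given by each criterion
C = T.sum(axis=0)          # influence received by each criterion
prominence = R + C         # overall importance
relation = R - C           # net cause (+) or effect (-)

weights = prominence / prominence.sum()   # one common way to derive HOQ weights
print("prominence:", np.round(prominence, 3))
print("relation  :", np.round(relation, 3))
print("weights   :", np.round(weights, 3))
```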

Keywords: DEMATEL, group decision making, QFD, supplier selection

Procedia PDF Downloads 400
4795 Understanding Walkability in the Libyan Urban Space: Policies, Perceptions and Smart Design for Sustainable Tripoli

Authors: A. Abdulla Khairi Mohamed, Mohamed Gamal Abdelmonem, Gehan Selim

Abstract:

Walkability in civic and public spaces in Libyan cities is challenging due to the lack of accessibility design, informal merging into car traffic, and the general absence of adequate urban and space planning. The lack of accessible and pedestrian-friendly public spaces in Libyan cities has emerged as a major concern for the government if it is to develop smart and sustainable spaces for the 21st century. A walkable urban space has become a driver for urban development and the redistribution of land use to ensure pedestrian and walkable routes between sites of living and workplaces. The characteristics of urban open space in the city centre play a main role in attracting people to walk when attending to their daily needs, recreation and daily sports. There is a significant gap in the understanding of the perceptions, feasibility and capabilities of Libyan urban space to accommodate, enhance or support the smart design of a walkable, pedestrian-friendly environment that is safe and accessible to everyone. The paper aims to undertake observations of walkability and walkable space in the city of Tripoli as a benchmark for Libyan cities; to assess the validity and consistency of the seven principal aspects of smart design, safety, accessibility and the 51 factors that affect walkability in open urban space in Tripoli, through an analysis by 10 local urban space experts (town planners, architects, transport engineers and urban designers); and to explore user groups’ perceptions of accessibility in walkable spaces in Libyan cities through questionnaires. The study sampled 200 respondents in 2015-16. The results of this study are useful for urban planning, to classify the walkable urban space elements which help improve the level of walkability in Libyan cities and create sustainable and liveable urban spaces.

Keywords: walkability, sustainability, liveability, accessibility

Procedia PDF Downloads 405
4794 Effects of Earthquake-Induced Debris on Pedestrian and Community Street Network Resilience

Authors: Al-Amin, Huanjun Jiang, Anayat Ali

Abstract:

Reinforced concrete frames (RC), especially Ordinary RC frames, are prone to structural failures/collapse during seismic events, leading to a large proportion of debris from the structures, which obstructs adjacent areas, including streets. These blocked areas severely impede post-earthquake resilience. This study uses computational simulation (FEM) to investigate the amount of debris generated by the seismic collapse of an ordinary reinforced concrete moment frame building and its effects on the adjacent pedestrian and road network. A three-story ordinary reinforced concrete frame building, primarily designed for gravity load and earthquake resistance, was selected for analysis. Sixteen different ground motions were applied and scaled up until the total collapse of the tested building to evaluate the failure mode under various seismic events. Four types of collapse direction were identified through the analysis, namely aligned (positive and negative) and skewed (positive and negative), with aligned collapse being more predominant than skewed cases. The amount and distribution of debris around the collapsed building were assessed to investigate the interaction between collapsed buildings and adjacent street networks. An interaction was established between a building that collapsed in an aligned direction and the adjacent pedestrian walkway and narrow street located in an unplanned old city. The FEM model was validated against an existing shaking table test. The presented results can be utilized to simulate the interdependency between the debris generated from the collapse of seismic-prone buildings and the resilience of street networks. These findings provide insights for better disaster planning and resilient infrastructure development in earthquake-prone regions.

Keywords: building collapse, earthquake-induced debris, ORC moment resisting frame, street network

Procedia PDF Downloads 57
4793 A Review of Feature Selection Methods Implemented in Neural Stem Cells

Authors: Natasha Petrovska, Mirjana Pavlovic, Maria M. Larrondo-Petrie

Abstract:

Neural stem cells (NSCs) are multi-potent, self-renewing cells that generate new neurons. Three subtypes of NSCs can be distinguished according to the stages of the NSC lineage: quiescent neural stem cells (qNSCs), activated neural stem cells (aNSCs) and neural progenitor cells (NPCs), but their gene expression signatures are not yet fully understood. Single-cell examinations have started to elucidate the complex structure of NSC populations. Nevertheless, there is a lack of thorough molecular interpretation of the NSC lineage heterogeneity and an increasing need for tools to analyze and improve the efficiency and correctness of single-cell sequencing data. Feature selection and ordering can identify and classify the gene expression signatures of these subtypes and can discover novel subpopulations during the NSC activation and differentiation processes. The aim here is to review the implementation of feature selection techniques on NSC subtypes and the classification techniques that have been used for the identification of gene expression signatures.

Keywords: feature selection, feature similarity, neural stem cells, genes, feature selection methods

Procedia PDF Downloads 106
4792 Accidents Involving Pedestrians Walking along with/against Traffic: An Evaluation of Crash Characteristics and Injuries

Authors: Chih-Wei Pai, Rong-Chang Jou

Abstract:

Using A1 and A2 police-reported accident data for the years 2003–2010 in Taiwan, the paper examines anatomic injuries and crash characteristics specific to pedestrians in “facing traffic” and “back to traffic” crashes. There were 2768 and 7558 accidents involving pedestrians walking along with and against traffic, respectively. Injuries sustained by pedestrians and crash characteristics in these two crash types were compared with those in other crash types (nearside crash, nearside dart-out crash, offside crash, offside dart-out crash). Main findings include that “back to traffic” crashes resulted in more severe injuries, and pedestrians in “back to traffic” crashes sustained more head, neck, and spine injuries than those in other crash types; there was also an elevated risk of head injuries in unlit darkness and on NBU (non-built-up) roadways. Several crash features (e.g. unlit darkness, overtaking maneuvers, phone use by pedestrians and drivers, intoxicated drivers) appear to be over-involved in “back to traffic” crashes. The implications of the research findings regarding pedestrian/driver education, enforcement, and remedial engineering design are discussed.

Keywords: pedestrian accident, crash characteristics, injury, facing traffic, back to traffic

Procedia PDF Downloads 334
4791 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining and pattern recognition to overcome the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.

Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct

Procedia PDF Downloads 185
4790 Classification of Random Doppler-Radar Targets during the Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During surveillance operations at war or peace time, the radar operator gets a scatter of targets over the screen. This may be a tracked vehicle like a tank, such as a T72 or BMP, or it may be a wheeled vehicle like an ALS, TATRA, 2.5 Tonne or Shaktiman, or a moving army, moving convoys etc. The radar operator selects one of the promising targets into single target tracking (STT) mode. Once the target is locked, the operator gets a typical audible signal into his headphones. With reference to the experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and solely dependent on the skills of the operator, and thus may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods like the fast Fourier transform (FFT) and principal component analysis (PCA) to identify random objects. The process of classification is based on transforming the audible signature of the target into music octave-notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. This whole study is based on live data.
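
A minimal signal-processing sketch of the pipeline described, an FFT of the audible target signature, mapping of spectral peaks to octave-note codes, and PCA on the resulting feature vectors; the synthetic signals and note mapping are placeholders, not the authors' live radar data.

```python
import numpy as np

FS = 8000                      # assumed audio sampling rate (Hz)
A4 = 440.0                     # reference pitch for the octave-note mapping

def dominant_freqs(signal, n_peaks=3):
    """Return the n_peaks strongest frequency bins of the signal via FFT."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    order = np.argsort(spectrum)[::-1]
    return freqs[order[:n_peaks]]

def to_octave_notes(freqs):
    """Map frequencies to semitone numbers relative to A4 (a crude 'octave-note' code)."""
    return np.round(12.0 * np.log2(np.clip(freqs, 1e-6, None) / A4))

# Synthetic audible signatures for two target classes (placeholder for live data).
t = np.arange(0, 0.25, 1.0 / FS)
tracked = [np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 660 * t)
           + 0.1 * np.random.randn(t.size) for _ in range(20)]
wheeled = [np.sin(2 * np.pi * 330 * t) + 0.5 * np.sin(2 * np.pi * 990 * t)
           + 0.1 * np.random.randn(t.size) for _ in range(20)]

# Feature matrix: octave-note codes of the dominant spectral peaks.
X = np.array([to_octave_notes(dominant_freqs(s)) for s in tracked + wheeled])

# PCA via eigen-decomposition of the covariance matrix.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
components = eigvecs[:, ::-1][:, :2]          # two principal directions
scores = Xc @ components                      # projected features for a classifier
print("explained variance ratio:", np.round(eigvals[::-1] / eigvals.sum(), 3)[:2])
```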

Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP

Procedia PDF Downloads 367
4789 An Improved Sub-Nyquist Sampling Jamming Method for Deceiving Inverse Synthetic Aperture Radar

Authors: Yanli Qi, Ning Lv, Jing Li

Abstract:

The Sub-Nyquist sampling jamming method (SNSJ) is a well-known deception jamming method for inverse synthetic aperture radar (ISAR). However, the SNSJ method is easier to counter, since the amplitudes of the false-target images are weaker than that of the real-target image, the false-target images always lag behind the real-target image, and all targets are located in the same cross-range. In order to overcome the drawbacks mentioned above, a simple modulation based on SNSJ (M-SNSJ) is presented in this paper. The method first uses an amplitude modulation factor to make the amplitude of the false-target images consistent with the real-target image, then uses a down-range modulation factor and a cross-range modulation factor to make the false-target images move freely in down-range and cross-range, respectively; thus the capacity for deception is improved. Finally, simulation results on the six available combinations of the three modulation factors are given to illustrate our conclusion.

Keywords: inverse synthetic aperture radar (ISAR), deceptive jamming, Sub-Nyquist sampling jamming method (SNSJ), modulation based on Sub-Nyquist sampling jamming method (M-SNSJ)

Procedia PDF Downloads 182
4788 Multi-Sensor Target Tracking Using Ensemble Learning

Authors: Bhekisipho Twala, Mantepu Masetshaba, Ramapulana Nkoana

Abstract:

Multiple classifier systems combine several individual classifiers to deliver a final classification decision. However, an increasingly controversial question is whether such systems can outperform the single best classifier, and if so, what form of multiple classifier system yields the most significant benefit. In addition, multi-target tracking and detection using multiple sensors is an important research field in mobile techniques and military applications. In this paper, several multiple classifier systems are evaluated in terms of their ability to predict a system’s failure or success for multi-sensor target tracking tasks. The Bristol Eden project dataset is utilised for this task. Experimental and simulation results show that the human activity identification system can fulfill the requirements of target tracking owing to improved sensor classification performance, with multiple classifier systems constructed using boosting achieving the highest accuracy rates.
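
A generic illustration of the single-classifier versus ensemble comparison described above, on a synthetic dataset standing in for the Bristol Eden sensor records (which are not reproduced here).

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for multi-sensor success/failure records.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                           random_state=0)

models = {
    "single decision tree":         DecisionTreeClassifier(random_state=0),
    "random forest (bagged trees)": RandomForestClassifier(n_estimators=100, random_state=0),
    "AdaBoost (boosted stumps)":    AdaBoostClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```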

Keywords: single classifier, ensemble learning, multi-target tracking, multiple classifiers

Procedia PDF Downloads 230
4787 Ground Motion Modeling Using the Least Absolute Shrinkage and Selection Operator

Authors: Yildiz Stella Dak, Jale Tezcan

Abstract:

Ground motion models that relate a strong motion parameter of interest to a set of predictive seismological variables describing the earthquake source, the propagation path of the seismic wave, and the local site conditions constitute a critical component of seismic hazard analyses. When a sufficient number of strong motion records are available, ground motion relations are developed using statistical analysis of the recorded ground motion data. In regions lacking a sufficient number of recordings, a synthetic database is developed using stochastic, theoretical or hybrid approaches. Regardless of the manner in which the database was developed, ground motion relations are developed using regression analysis. Development of a ground motion relation is a challenging process which inevitably requires the modeler to make subjective decisions regarding the inclusion criteria of the recordings, the functional form of the model and the set of seismological variables to be included in the model. Because these decisions are critically important to the validity and the applicability of the model, there is continuing interest in procedures that will facilitate the development of ground motion models. This paper proposes the use of the Least Absolute Shrinkage and Selection Operator (LASSO) in selecting the set of predictive seismological variables to be used in developing a ground motion relation. The LASSO can be described as a penalized regression technique with a built-in capability of variable selection. Similar to ridge regression, the LASSO is based on the idea of shrinking the regression coefficients to reduce the variance of the model. Unlike ridge regression, where the coefficients are shrunk but never set equal to zero, the LASSO sets some of the coefficients exactly to zero, effectively performing variable selection. Given a set of candidate input variables and the output variable of interest, LASSO allows ranking the input variables in terms of their relative importance, thereby facilitating the selection of the set of variables to be included in the model. Because the risk of overfitting increases as the ratio of the number of predictors to the number of recordings increases, selection of a compact set of variables is important in cases where a small number of recordings are available. In addition, identification of a small set of variables can improve the interpretability of the resulting model, especially when there is a large number of candidate predictors. A practical application of the proposed approach is presented, using more than 600 recordings from the Next Generation Attenuation (NGA) database, where the effect of a set of seismological predictors on the 5% damped maximum direction spectral acceleration is investigated. The candidate predictors considered are magnitude, Rrup, and Vs30. Using LASSO, the relative importance of the candidate predictors has been ranked. Regression models with increasing levels of complexity were constructed using the one, two, three, and four best predictors, and the models’ ability to explain the observed variance in the target variable has been compared. The bias-variance trade-off in the context of model selection is discussed.
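
A scaled-down sketch of the variable-selection behaviour described above, using synthetic data and generic predictor names rather than the actual NGA recordings: as the LASSO penalty grows, the less informative coefficients are driven exactly to zero.

```python
import numpy as np
from sklearn.linear_model import Lasso, LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 600  # roughly the number of recordings mentioned above

# Synthetic stand-ins for seismological predictors plus an irrelevant noise column.
magnitude = rng.uniform(4.0, 8.0, n)
log_rrup = np.log(rng.uniform(1.0, 200.0, n))
log_vs30 = np.log(rng.uniform(150.0, 1500.0, n))
nuisance = rng.normal(size=n)

X = StandardScaler().fit_transform(
    np.column_stack([magnitude, log_rrup, log_vs30, nuisance]))
# Synthetic "ln Sa" target driven mostly by magnitude and distance.
y = 1.2 * X[:, 0] - 0.9 * X[:, 1] - 0.3 * X[:, 2] + 0.2 * rng.normal(size=n)

names = ["magnitude", "ln Rrup", "ln Vs30", "nuisance"]
for alpha in [0.01, 0.1, 0.5]:
    coefs = Lasso(alpha=alpha).fit(X, y).coef_
    print(f"alpha={alpha}:", {nm: round(float(c), 3) for nm, c in zip(names, coefs)})

# Cross-validated choice of the penalty.
best = LassoCV(cv=5).fit(X, y)
print("LassoCV alpha:", round(float(best.alpha_), 4),
      "coefficients:", np.round(best.coef_, 3))
```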

Keywords: ground motion modeling, least absolute shrinkage and selection operator, penalized regression, variable selection

Procedia PDF Downloads 302
4786 An Analysis of the Differences between Three Levels of Water Polo Players Based on Indicators of Efficiency

Authors: Mladen Hraste, Igor Jelaska, Ivan Granic

Abstract:

The scope of this research is the identification and explanation of differences between three levels of water polo players in some parameters of effectiveness. The sample for this study was 132 matches of the Adriatic Water Polo League in the 2013/14 competition season. Using the Kruskal-Wallis test and multiple comparisons of mean ranks for all groups at the significance level of α = 0.05, the hypothesis that there are significant differences between groups of respondents in ten of the seventeen variables of effectiveness was confirmed. There is a reasonable possibility that the differences are caused by the degree of learned and implemented tactical knowledge, the degree of scoring ability and the best selection for certain roles in the team. The results of this study can be applied to the selection of teams and players, to the selection of an appropriate match concept, and to organizing the training process.

Keywords: scoring abilities, selection, tactical knowledge, water polo effectiveness

Procedia PDF Downloads 474
4785 Recruitment Model (FSRM) for Faculty Selection Based on Fuzzy Soft Sets

Authors: G. S. Thakur

Abstract:

This paper presents a Fuzzy Soft Recruitment Model (FSRM) for faculty selection in MHRD technical institutions. The selection criteria are based on the 4-tier flexible structure in the institutions. The Advisory Committee on Faculty Recruitment (ACoFAR) suggested nine criteria for faculty in the proposed FSRM. The fuzzy soft model is proposed in consultation with ACoFAR based on the selection criteria. Fuzzy soft distance similarity measures are applied for finding the best faculty candidates from the applicant pool.
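
A toy sketch of a fuzzy soft distance-based similarity comparison of applicants against an ideal faculty profile; the criteria, membership values and the specific similarity formula are illustrative assumptions, not ACoFAR's actual nine criteria or the paper's exact measure.

```python
import numpy as np

criteria = ["teaching", "research", "qualification", "experience"]  # illustrative subset
ideal = np.array([1.0, 1.0, 1.0, 0.9])      # fuzzy soft set of the "ideal" faculty profile

applicants = {
    "A": np.array([0.8, 0.9, 1.0, 0.7]),
    "B": np.array([0.9, 0.6, 0.8, 0.9]),
    "C": np.array([0.7, 0.8, 0.9, 0.6]),
}

def similarity(f, g):
    """Distance-based similarity of two fuzzy soft sets: S = 1 / (1 + d),
    with d the mean absolute difference of memberships (one common choice)."""
    d = np.mean(np.abs(f - g))
    return 1.0 / (1.0 + d)

ranked = sorted(applicants.items(), key=lambda kv: similarity(ideal, kv[1]), reverse=True)
for name, profile in ranked:
    print(f"{name}: similarity to ideal = {similarity(ideal, profile):.3f}")
```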

Keywords: fuzzy soft set, fuzzy sets, fuzzy soft distance, fuzzy soft similarity measures, ACoFAR

Procedia PDF Downloads 314
4784 Using Finite Element to Predict Failure of Lightweight Bridges Due to Vehicle Impact: Case Study

Authors: Amin H. Almasri, Rajai Z. Alrousan, Al-Harith Manasrah

Abstract:

The collapse of lightweight pedestrian bridges due to vehicle collision is investigated and studied in detail using dynamic nonlinear finite element analysis. A typical bridge widely used in Jordan is studied and modeled under truck collision using one-dimensional beam finite elements in order to minimize analysis time, given the dynamic nature of the problem. Truck collision with the bridge is simulated at different speeds and collision locations using a dynamic explicit finite element scheme with material nonlinearity taken into account. Energy absorption of the bridge is investigated through the principle of energy conservation, where the truck's kinetic energy is assumed to be stored in the bridge as strain energy. Weak failure points in the bridge were identified, and modifications are proposed in order to strengthen the bridge structure and prevent total collapse. The proposed design modifications to the bridge structure were successful in allowing the bridge to fail locally rather than globally and are expected to help save lives.

Keywords: finite element method, dynamic impact, pedestrian bridges, strain energy, collapse failure

Procedia PDF Downloads 594
4783 Analytic Network Process in Location Selection and Its Application to a Real Life Problem

Authors: Eylem Koç, Hasan Arda Burhan

Abstract:

Location selection presents a crucial decision problem in today’s business world, where strategic decision-making processes have critical importance. Thus, location selection has strategic importance for companies in boosting their competitive strength, increasing corporate performance and efficiency, and lowering production and transportation costs. The right choice in location selection has a direct impact on companies’ commercial success. In this study, a store location selection problem of Carglass Turkey, which operates in the vehicle glass branch, is handled. As this problem includes both tangible and intangible criteria, the Analytic Network Process (ANP) was adopted as the main methodology. The model consists of a control hierarchy and BOCR subnetworks which include clusters of actors, alternatives and criteria. In accordance with the management’s choices, five different locations were selected. In addition to the literature review, close cooperation with the actor group was ensured and maintained while determining the criteria and during the whole process. The obtained results were presented to the management as a report, and their feasibility was confirmed accordingly.

Keywords: analytic network process (ANP), BOCR, multi-actor decision making, multi-criteria decision making, real-life problem, location selection

Procedia PDF Downloads 440
4782 Prediction of MicroRNA-Target Gene by Machine Learning Algorithms in Lung Cancer Study

Authors: Nilubon Kurubanjerdjit, Nattakarn Iam-On, Ka-Lok Ng

Abstract:

MicroRNAs are small non-coding RNAs found in many different species. They play crucial roles in cancer, in biological processes such as apoptosis and proliferation. The identification of microRNA-target genes can be an essential first step towards revealing the role of microRNAs in various cancer types. In this paper, we predict miRNA-target genes for lung cancer by integrating prediction scores from the miRanda and PITA algorithms, used as a feature vector of miRNA-target interaction. Then, machine-learning algorithms were implemented to make the final prediction. The approach developed in this study should be of value for future studies into understanding the role of miRNAs in the molecular mechanisms enabling lung cancer formation.
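
A schematic of the scoring-and-classification pipeline described above, with synthetic scores standing in for real miRanda and PITA outputs: the two algorithm scores form a two-dimensional feature vector per miRNA-gene pair, on which Naïve Bayes and SVM classifiers are trained.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_pairs = 500

# Synthetic miRanda and PITA scores for miRNA-gene pairs; true interactions (label 1)
# tend to receive better scores from both tools.
labels = rng.integers(0, 2, n_pairs)
miranda = rng.normal(loc=140 + 20 * labels, scale=15, size=n_pairs)  # alignment score
pita = rng.normal(loc=-8 - 4 * labels, scale=3, size=n_pairs)        # ddG-style score
X = np.column_stack([miranda, pita])

classifiers = {
    "Naive Bayes": GaussianNB(),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```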

Keywords: microRNA, miRNAs, lung cancer, machine learning, Naïve Bayes, SVM

Procedia PDF Downloads 366
4781 Application of Principal Component Analysis for Classification of Random Doppler-Radar Targets during the Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During surveillance operations at war or peace time, the Radar operator gets a scatter of targets over the screen. This may be a tracked vehicle like a tank, such as a T72 or BMP, or it may be a wheeled vehicle like an ALS, TATRA, 2.5 Tonne or Shaktiman, or a moving army, moving convoys etc. The Radar operator selects one of the promising targets into Single Target Tracking (STT) mode. Once the target is locked, the operator gets a typical audible signal into his headphones. With reference to the experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and solely dependent on the skills of the operator, and thus may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods like the Fast Fourier Transform (FFT) and Principal Component Analysis (PCA) to identify random objects. The process of classification is based on transforming the audible signature of the target into music octave-notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. This whole study is based on live data.

Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP

Procedia PDF Downloads 320
4780 A Comparative Study of k-NN and MLP-NN Classifiers Using GA-kNN Based Feature Selection Method for Wood Recognition System

Authors: Uswah Khairuddin, Rubiyah Yusof, Nenny Ruthfalydia Rosli

Abstract:

This paper presents a comparative study between the k-Nearest Neighbour (k-NN) and Multi-Layer Perceptron Neural Network (MLP-NN) classifiers, using a Genetic Algorithm (GA) as the feature selector, for a wood recognition system. The features have been extracted from the images using the Grey Level Co-Occurrence Matrix (GLCM). The use of GA-based feature selection is mainly to ensure that the database used for training the wood species pattern classifier consists of only optimized features. The feature selection process is aimed at selecting only the most discriminating features of the wood species, to reduce confusion for the pattern classifier. This feature selection approach keeps the ‘good’ features, those that minimize the intra-class distance and maximize the inter-class distance. A wrapper GA is used with the k-NN classifier as the fitness evaluator (GA-kNN). The results show that k-NN is the best choice of classifier because it uses a very simple distance calculation algorithm, and classification tasks can be done in a short time with good classification accuracy.
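
A minimal wrapper GA-kNN sketch of the kind described above, with a standard scikit-learn dataset standing in for the GLCM wood features; the GA operators and settings are generic assumptions rather than the authors' configuration.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)            # stand-in for the GLCM wood features
X = StandardScaler().fit_transform(X)
n_features = X.shape[1]

def fitness(mask):
    """k-NN cross-validation accuracy on the selected feature subset (the GA fitness)."""
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(n_neighbors=3), X[:, mask], y, cv=3).mean()

pop_size, generations, mutation_rate = 20, 15, 0.05
population = rng.random((pop_size, n_features)) < 0.5    # random bitmasks

for _ in range(generations):
    scores = np.array([fitness(ind) for ind in population])
    order = np.argsort(scores)[::-1]
    parents = population[order[: pop_size // 2]]          # truncation selection
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_features)                  # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_features) < mutation_rate      # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    population = np.vstack([parents, children])

best = max(population, key=fitness)
print("selected features:", np.flatnonzero(best), "fitness:", round(fitness(best), 3))
```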

Keywords: feature selection, genetic algorithm, optimization, wood recognition system

Procedia PDF Downloads 509
4779 Tracking Filtering Algorithm Based on ConvLSTM

Authors: Ailing Yang, Penghan Song, Aihua Cai

Abstract:

The nonlinear maneuvering target tracking problem is mainly a state estimation problem when the target motion model is uncertain. Traditional solutions include Kalman filtering, based on the Bayesian filtering framework, and extended Kalman filtering. However, these methods need prior knowledge such as the kinematics model and the state system distribution, and their performance is poor in state estimation of complex dynamic systems without such priors. Therefore, in view of the problems existing in traditional algorithms, a convolutional LSTM target state estimation (SAConvLSTM-SE) algorithm based on Self-Attention Memory (SAM) is proposed to learn the historical motion state of the target and the error distribution information measured at the current time. The measured track point data of an airborne radar are processed into data sets. After supervised training, the data-driven deep neural network based on SAConvLSTM can directly output the target state at the next moment. Through experiments on two different maneuvering targets, we find that the network has stronger robustness and better tracking accuracy than the existing tracking methods.
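
For reference, the classical prior-model baseline the abstract contrasts against can be as simple as a constant-velocity Kalman filter; the sketch below uses an assumed motion model, noise levels and simulated measurements, not the airborne radar data.

```python
import numpy as np

dt = 1.0                                    # radar update interval (assumed)
F = np.array([[1, dt], [0, 1]])             # constant-velocity state transition
H = np.array([[1, 0]])                      # only position is measured
Q = 0.01 * np.eye(2)                        # process noise covariance (assumed)
R = np.array([[4.0]])                       # measurement noise covariance (assumed)

def kalman_track(measurements):
    """Run a 1-D constant-velocity Kalman filter over a sequence of position measurements."""
    x = np.array([[measurements[0]], [0.0]])     # initial state: first measurement, zero velocity
    P = np.eye(2) * 10.0                         # initial uncertainty
    estimates = []
    for z in measurements:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0, 0])
    return estimates

rng = np.random.default_rng(0)
true_pos = np.cumsum(np.full(50, 3.0))               # target moving at 3 units per step
noisy = true_pos + rng.normal(scale=2.0, size=50)    # simulated radar measurements
est = kalman_track(noisy)
print("final estimate vs truth:", round(est[-1], 2), "vs", true_pos[-1])
```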

Keywords: maneuvering target, state estimation, Kalman filter, LSTM, self-attention

Procedia PDF Downloads 106
4778 Ranking of Inventory Policies Using Distance Based Approach Method

Authors: Gupta Amit, Kumar Ramesh, P. C. Tewari

Abstract:

Globalization is putting enormous pressure on business organizations, especially manufacturing ones, to rethink their supply chains in innovative ways. Inventory consumes a major portion of total sales revenue, so effective and efficient inventory management plays a vital role in the successful functioning of any organization. Selection of an inventory policy is one of the important purchasing activities. This paper focuses on the selection and ranking of alternative inventory policies. A deterministic quantitative model based on the Distance Based Approach (DBA) method has been developed for the evaluation and ranking of inventory policies. We have employed this concept for the first time for this type of selection problem. Four inventory policies are considered: Economic Order Quantity (EOQ), Just in Time (JIT), Vendor Managed Inventory (VMI) and a monthly policy. An improper selection could affect a company’s competitiveness in terms of the productivity of its facilities and the quality of its products. The ranking of inventory policies is a multi-criteria problem. There is a need to first identify the selection criteria and then process the information with reference to the relative importance of attributes for comparison. Criteria values for each inventory policy can be obtained either analytically or by using a simulation technique, or they may be linguistic subjective judgments defined by fuzzy sets. A methodology is developed and applied to rank the inventory policies.
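
One common formulation of the Distance Based Approach is sketched below with invented criteria values, not the paper's data: normalise the decision matrix, take the best value of each criterion as the optimal point, and rank the policies by their weighted Euclidean distance from it.

```python
import numpy as np

policies = ["EOQ", "JIT", "VMI", "Monthly"]
criteria = ["holding cost", "ordering cost", "service level", "ease of use"]
benefit = np.array([False, False, True, True])        # True = higher is better

# Hypothetical criteria values for each policy (rows) on each criterion (columns).
X = np.array([[12.0, 5.0, 0.92, 7.0],
              [ 6.0, 9.0, 0.95, 5.0],
              [ 8.0, 7.0, 0.97, 6.0],
              [14.0, 4.0, 0.90, 8.0]])
weights = np.array([0.35, 0.15, 0.35, 0.15])          # assumed relative importance

N = X / np.linalg.norm(X, axis=0)                     # vector-normalised decision matrix
optimal = np.where(benefit, N.max(axis=0), N.min(axis=0))   # best value per criterion

distance = np.sqrt(((weights * (N - optimal)) ** 2).sum(axis=1))
for rank, i in enumerate(np.argsort(distance), start=1):
    print(f"{rank}. {policies[i]} (distance {distance[i]:.4f})")
```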

Keywords: inventory policy, ranking, DBA, selection criteria

Procedia PDF Downloads 352
4777 Synthesis and Characterization of Anti-Psychotic Drugs Based DNA Aptamers

Authors: Shringika Soni, Utkarsh Jain, Nidhi Chauhan

Abstract:

Aptamers are recently discovered, ~80-100 bp long artificial oligonucleotides that have demonstrated applications not only in therapeutics but also, extensively, in diagnostics and sensing, where they are used to detect different biomarkers and drugs. Synthesizing aptamers against proteins or genomic templates is comparatively feasible in the laboratory, but aptamers against drugs or other chemical targets require more specific, properly optimized and validated protocols. One has to optimize all selection, amplification, and characterization steps of the end product, which is extremely time-consuming. Therefore, we performed asymmetric PCR (polymerase chain reaction) for random oligonucleotide pool synthesis and further used the pool in Systematic Evolution of Ligands by EXponential enrichment (SELEX) for the synthesis of aptamers against anti-psychotic drugs. Anti-psychotic drugs are major tranquilizers used to control psychosis and maintain proper cognitive function. Though their medical use is limited, their misuse may lead to severe medical conditions such as addiction and can promote crime, with social and economic impact. In this work, we have approached the in-vitro SELEX method for the synthesis of ssDNA aptamers against anti-psychotic drugs (in this case, the ‘target’). The study was performed in three stages, where the first stage included synthesis of the random oligonucleotide pool via asymmetric PCR, whose end product was analyzed with electrophoresis and purified for the further stages. The purified oligonucleotide pool was incubated in SELEX buffer, and partitioning was performed in the next stage to obtain target-specific aptamers. The isolated oligonucleotides were characterized and quantified after each round of partitioning, and significant results were obtained. After the repetitive partitioning and amplification steps of target-specific oligonucleotides, the final stage included sequencing of the end product. We can confirm the specific sequence for anti-psychotic drugs, which will be further used in diagnostic applications in clinical and forensic settings.

Keywords: anti-psychotic drugs, aptamer, biosensor, ssDNA, SELEX

Procedia PDF Downloads 108