Search results for: fuzzy set methods
14911 Implant Guided Surgery and Immediate Loading
Authors: Omid Tavakol, Mahnaz Gholami
Abstract:
Introduction: The main goal of this oral presentation is to discuss immediate loading in dental implants, from treatment planning and surgical guide design to delivery, follow-up, and occlusal considerations. Methods and materials: First, systematic reviews on immediate loading will be considered. In addition, a comparison will be made between immediate loading and conventional loading in terms of success rate and complications. After that, the different methods, prosthetic options, and materials best used in immediate loading will be explained. In particular, multi-unit abutments and their mechanism of function will be described. Digital impressions and the design of temporaries are the next topics to be explicated, followed by the differences between single-unit, multiple-unit, and full-arch implantation in immediate loading. We then describe methods for tissue engineering and papilla formation after extraction. The last slides present a full-mouth rehabilitation via the immediate loading technique, from surgical design to follow-up. Finally, we discuss potential complications, how to prevent their occurrence, and what to do if they arise.
Keywords: guided surgery, digital implantology, immediate loading, digital dentistry
Procedia PDF Downloads 441
14910 Assessment of the Electrical, Mechanical, and Thermal Nociceptive Thresholds for Stimulation and Pain Measurements at the Bovine Hind Limb
Authors: Samaneh Yavari, Christiane Pferrer, Elisabeth Engelke, Alexander Starke, Juergen Rehage
Abstract:
Background: Three nociceptive thresholds (thermal, electrical, and mechanical) are commonly used to evaluate local anesthesia in many species, for instance, cows, horses, cats, dogs, and rabbits. Given the lack of investigations evaluating and/or validating these nociceptive thresholds, our aim was to compare two foot local anesthesia methods: Intravenous Regional Anesthesia (IVRA) and our modified four-point Nerve Block Anesthesia (NBA). Materials and Methods: Eight healthy, nonpregnant, nondairy Holstein Friesian cows were selected for this cross-over study. All cows were divided into two groups to receive the two local anesthesia techniques, IVRA and the modified four-point NBA. Thermal, electrical, and mechanical force and pinprick stimuli were applied to evaluate the quality of the local anesthesia methods before and after their application. Results: The statistical evaluation demonstrated that our four-point NBA qualifies as a standard foot local anesthesia. However, the recorded results revealed no significant difference between the two local anesthesia techniques, IVRA and modified four-point NBA, in the quality and duration of anesthesia assessed by electrical, mechanical, and thermal nociceptive stimuli. Conclusion and discussion: All three nociceptive threshold stimuli (electrical, mechanical, and heat) can be applied to measure and evaluate the efficacy of foot local anesthesia in dairy cows. However, our study revealed no superiority of any of the three nociceptive methods in evaluating the duration and quality of bovine foot local anesthesia. Veterinarians can use any of the heat, mechanical, and electrical methods to investigate the duration and quality of their selected anesthesia method.
Keywords: mechanical, thermal, electrical threshold, IVRA, NBA, hind limb, dairy cow
Procedia PDF Downloads 245
14909 Predicting Consolidation Coefficient of Busan Clay by Time-Displacement-Velocity Methods
Authors: Thang Minh Le, Hadi Khabbaz
Abstract:
The coefficient of consolidation is a parameter governing the rate at which saturated soil, particularly clay, undergoes consolidation when subjected to an increase in pressure. The rate and amount of compression in soil vary with the rate at which pore water is lost, and hence depend on soil permeability. Over many years, various methods have been proposed to determine the coefficient of consolidation, cv, which is an indication of the rate of foundation settlement on soft ground. However, defining this parameter is often problematic and relies heavily on graphical techniques, which are subject to some uncertainties. This paper initially presents an overview of many well-established methods for determining the vertical coefficient of consolidation from incremental loading consolidation tests. An array of consolidation tests was conducted on undisturbed clay samples collected at various depths from a site in the Nakdong river delta, Busan, South Korea. The consolidation test results on these soft, sensitive clay samples were employed to evaluate the targeted methods for predicting the settlement rate of Busan clay. Within the time-displacement-velocity relationship, a total of 3 method groups drawn from 10 common procedures were classified and compared. Discussion of the study results is also provided.
Keywords: Busan clay, coefficient of consolidation, constant rate of strain, incremental loading
Procedia PDF Downloads 186
14908 Analysis of ECGs Survey Data by Applying Clustering Algorithm
Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif
Abstract:
The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data is first evaluated and filtered using automated Minnesota codes, and only those ECGs fulfilling the standardized conditions specified in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm combined with fuzzy c-means. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix
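The clustering stage above combines spectral clustering with fuzzy c-means. As an illustration only (the abstract gives no implementation details), a minimal fuzzy c-means sketch on synthetic 2-D data, assuming the usual fuzzifier m = 2:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Basic fuzzy c-means: returns the membership matrix U and the centroids."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)       # memberships of each point sum to 1
    for _ in range(n_iter):
        W = U ** m                           # fuzzified memberships
        centroids = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        U = 1.0 / (d ** p * np.sum(d ** (-p), axis=1, keepdims=True))
    return U, centroids

# Two well-separated synthetic clusters (stand-ins for ECG feature vectors)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
U, centroids = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```

Spectral clustering would first embed the data via eigenvectors of a graph Laplacian and then run fuzzy c-means in that embedding; the sketch shows only the fuzzy c-means stage.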
Procedia PDF Downloads 351
14907 A Comparison between Russian and Western Approach for Deep Foundation Design
Authors: Saeed Delara, Kendra MacKay
Abstract:
Varying methodologies are considered for pile design in both the Russian and Western approaches. Although both approaches rely on toe and side frictional resistances, different calculation methods are proposed to estimate pile capacity. The Western approach relies on compactness (internal friction angle) for cohesionless soils and undrained shear strength for cohesive soils. The Russian approach relies on grain size for cohesionless soils and liquidity index for cohesive soils. While the methods most recommended in Western approaches are relatively simple ways to predict pile settlement, the Russian approach provides a detailed method to estimate single pile and pile group settlement. Details of pile axial capacity and settlement calculations using the Russian and Western approaches are discussed and compared against field test results.
Keywords: pile capacity, pile settlement, Russian approach, western approach
Procedia PDF Downloads 166
14906 Removal of Hexavalent Chromium from Aqueous Solutions by Biosorption Using Macadamia Nutshells: Effect of Different Treatment Methods
Authors: Vusumzi E. Pakade, Themba D. Ntuli, Augustine E. Ofomaja
Abstract:
Macadamia nutshell biosorbents treated in three different ways (raw Macadamia nutshell powder (RMN), acid-treated Macadamia nutshell (ATMN) and base-treated Macadamia nutshell (BTMN)) were investigated for the adsorption of Cr(VI) from aqueous solutions. Fourier transform infrared spectroscopy (FT-IR) spectra of free and Cr(VI)-loaded sorbents as well as thermogravimetric analysis (TGA) revealed that the acid and base treatments modified the surface properties of the sorbents. The optimum conditions for the adsorption of Cr(VI) by the sorbents were pH 2, contact time 10 h, adsorbent dosage 0.2 g L-1, and concentration 100 mg L-1. The different treatment methods altered the surface characteristics of the sorbents and produced different maximum binding capacities of 42.5, 40.6 and 37.5 mg g-1 for RMN, ATMN and BTMN, respectively. The data were fitted to the Langmuir, Freundlich, Redlich-Peterson and Sips isotherms. No single model could clearly explain the data, perhaps due to the complexity of the processes taking place. The kinetic modeling results showed that Cr(VI) biosorption on the Macadamia sorbents was better described as chemical sorption following pseudo-second-order kinetics. These results show that the three treatment methods yielded different surface properties, which in turn influenced the adsorption of Cr(VI) differently.
Keywords: biosorption, chromium(VI), isotherms, Macadamia, reduction, treatment
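The isotherm fitting mentioned above can be illustrated with the linearized Langmuir form Ce/qe = Ce/qmax + 1/(KL·qmax). The data below are synthetic, generated from assumed parameters (with qmax echoing the 42.5 mg g-1 RMN capacity), not the study's measurements:

```python
import numpy as np

# Hypothetical equilibrium data: Ce in mg/L, qe in mg/g (illustration only)
q_max_true, K_L_true = 42.5, 0.05
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe = q_max_true * K_L_true * Ce / (1.0 + K_L_true * Ce)   # Langmuir isotherm

# Linearized Langmuir: Ce/qe = (1/q_max)*Ce + 1/(K_L*q_max); fit by least squares
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
q_max_fit = 1.0 / slope
K_L_fit = slope / intercept
```

With real data the points scatter around the line, and the goodness of the linear fit is one way to compare the Langmuir model against the Freundlich, Redlich-Peterson and Sips alternatives.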
Procedia PDF Downloads 266
14905 Control of a Quadcopter Using Genetic Algorithm Methods
Authors: Mostafa Mjahed
Abstract:
This paper concerns the control of a nonlinear system using two different methods: a reference model and a genetic algorithm. The quadcopter is a nonlinear, unstable system belonging to the class of aerial robots. It consists of four rotors placed at the ends of a cross, with the control circuit occupying the center of the cross. Its motion is governed by six degrees of freedom: three rotations around the roll, pitch and yaw axes and three spatial translations. The control of such a system is complex because of the nonlinearity of its dynamic representation and the number of parameters involved. Numerous studies have been developed to model and stabilize such systems. The classical PID and LQ correction methods are widely used. While these have the advantage of simplicity because they are linear, they have the drawback of requiring a linear model for synthesis. This also implies complex control laws, since those laws must be extended across the entire flight envelope of the quadcopter. Note that, while classical design methods are widely used to control aeronautical systems, Artificial Intelligence methods such as genetic algorithms have received little attention. In this paper, we compare two PID design methods. First, the parameters of the PID are calculated according to a reference model. In a second phase, these parameters are established using genetic algorithms. By reference model, we mean that the corrected system behaves according to a reference system imposed by specifications such as settling time and zero overshoot. Inspired by natural evolution and Darwin's theory of survival of the fittest, John Holland developed this evolutionary algorithm. A genetic algorithm (GA) possesses three basic operators: selection, crossover and mutation. We start the iterations with an initial population, and each member of this population is evaluated through a fitness function.
Our purpose is to correct the behavior of the quadcopter around the three axes (roll, pitch and yaw) with 3 PD controllers. For the altitude, we adopt a PID controller.
Keywords: quadcopter, genetic algorithm, PID, fitness, model, control, nonlinear system
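The GA-based PID tuning described above can be sketched as follows. The plant here is a stand-in second-order system rather than a quadcopter axis model, and the population size, gain bounds and mutation scale are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(gains, dt=0.01, T=5.0):
    """Integral of absolute error for a unit step on the toy plant x'' = -x' - x + u."""
    kp, ki, kd = gains
    x = v = integ = cost = 0.0
    prev_err = 1.0
    for _ in range(int(T / dt)):
        err = 1.0 - x
        integ += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integ + kd * deriv
        v += (-v - x + u) * dt               # forward-Euler integration
        x += v * dt
        if abs(x) > 1e6:                     # unstable gains: heavy penalty
            return 1e9
        cost += abs(err) * dt
    return cost

def ga(pop_size=30, gens=40, lo=0.0, hi=20.0):
    pop = rng.uniform(lo, hi, size=(pop_size, 3))            # [kp, ki, kd] members
    for _ in range(gens):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]   # selection: best half
        kids = []
        while len(kids) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            alpha = rng.random()
            child = alpha * a + (1.0 - alpha) * b            # arithmetic crossover
            child += rng.normal(0.0, 0.5, 3)                 # Gaussian mutation
            kids.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, kids])
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmin()], scores.min()

best_gains, best_cost = ga()
```

In the paper's setting the fitness would instead encode the reference-model specifications (settling time, zero overshoot) evaluated on the quadcopter dynamics.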
Procedia PDF Downloads 431
14904 Exploring the Challenges to Usage of Building Construction Cost Indices in Ghana
Authors: Jerry Gyimah, Ernest Kissi, Safowaa Osei-Tutu, Charles Dela Adobor, Theophilus Adjei-Kumi, Ernest Osei-Tutu
Abstract:
Price fluctuation contracts are imperative and of paramount essence in the construction industry, as they provide adequate relief and cushioning against changes in the prices of input resources during construction. As a result, several methods have been devised to help arrive at fair recompense in the event of price changes. However, stakeholders often appear not to be satisfied with the existing methods of fluctuation evaluation, ostensibly because of the challenges associated with them. The aim of this study was to identify the challenges to the usage of building construction cost indices in Ghana. Data was gathered from contractors and quantity surveying firms. The study utilized a survey questionnaire to elicit responses from the contractors and the consultants. The data gathered was analyzed using the relative importance index (RII) to rank the problems associated with the existing methods. The findings revealed, among others, the late release of data, inadequate recovery of costs, and work items of interest not being included in the published indices as the main challenges of the existing methods. The findings provide useful lessons for policymakers and practitioners in decision making towards the usage and improvement of the available indices.
Keywords: building construction cost indices, challenges, usage, Ghana
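The relative importance index referred to above is typically computed as RII = ΣW / (A × N), where W are the respondents' ratings of a factor, A is the top of the rating scale, and N is the number of respondents. A small sketch using hypothetical ratings, not the survey's data:

```python
def relative_importance_index(ratings, max_scale=5):
    """RII = sum(W) / (A * N); values closer to 1 mean a more severe challenge."""
    return sum(ratings) / (max_scale * len(ratings))

# Hypothetical 5-point ratings from five respondents (illustration only)
challenges = {
    "late release of data": [5, 5, 4, 5, 4],
    "inadequate recovery of costs": [4, 5, 4, 4, 4],
    "work items not in published indices": [4, 4, 3, 4, 4],
}
ranked = sorted(challenges, key=lambda c: relative_importance_index(challenges[c]),
                reverse=True)
```

Sorting by RII in descending order reproduces the kind of ranking the study reports, with the highest-RII item treated as the main challenge.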
Procedia PDF Downloads 152
14903 Credit Risk Assessment Using Rule Based Classifiers: A Comparative Study
Authors: Salima Smiti, Ines Gasmi, Makram Soui
Abstract:
Credit risk is the most important issue for financial institutions. Its assessment has become an important task used to predict defaulting customers and classify customers as good or bad payers. To this end, numerous techniques have been applied for credit risk assessment. However, to our knowledge, several evaluation techniques are black-box models, such as neural networks and SVMs, which generate applicant classes without any explanation. In this paper, we propose to assess credit risk using a rule-based classification method whose output is a set of rules that describe and explain the decision. To this end, we compare seven classification algorithms (JRip, Decision Table, OneR, ZeroR, Fuzzy Rule, PART and Genetic Programming (GP)), where the goal is to find the best rules satisfying several criteria: accuracy, sensitivity, and specificity. The obtained results confirm the efficiency of the GP algorithm on the German and Australian datasets compared to the other rule-based techniques for predicting credit risk.
Keywords: credit risk assessment, classification algorithms, data mining, rule extraction
Procedia PDF Downloads 181
14902 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble
Authors: Jaehong Yu, Seoung Bum Kim
Abstract:
Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. Unsupervised feature selection can be categorized into feature subset selection and feature ranking methods; we focus on unsupervised feature ranking methods, which evaluate features based on their importance scores. Recently, several unsupervised feature ranking methods were developed based on ensemble approaches to achieve higher accuracy and stability. However, most of the ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate feature importance depending on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we propose an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined into ensemble importance scores. Moreover, through the use of the multiple-k ensemble idea, FRRM does not require the true number of clusters to be determined in advance. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods.
The experimental results demonstrated that the proposed FRRM outperformed the competitors.
Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking
Procedia PDF Downloads 339
14901 Dialogue Meetings as an Arena for Collaboration and Reflection among Researchers and Practitioners
Authors: Kerstin Grunden, Ann Svensson, Berit Forsman, Christina Karlsson, Ayman Obeid
Abstract:
The research question of the article is whether the dialogue meetings method is relevant for reflective learning among researchers and practitioners when welfare technology is to be implemented in municipalities. A testbed was planned for implementation in a retirement home in a Swedish municipality, and the practitioners worked on a pre-study of that testbed. In the article, the dialogue between the researchers and the practitioners in the dialogue meetings is described and analyzed, and the potential of dialogue meetings as an arena for learning and reflection among researchers and practitioners is discussed. The research methodology is participatory action research with mixed methods (dialogue meetings, focus groups, participant observations). The main findings from the dialogue meetings were that the researchers learned more about the use of traditional research methods, while the practitioners learned more about how they could improve their use of these methods to facilitate change processes in their organization. These findings have the potential, for both the researchers and the practitioners, to result in more relevant use of research methods in organizational change processes. It is concluded that dialogue meetings can be relevant for reflective learning among researchers and practitioners when welfare technology is to be implemented in a health care organization.
Keywords: dialogue meetings, implementation, reflection, test bed, welfare technology, participatory action research
Procedia PDF Downloads 146
14900 Protein Remote Homology Detection and Fold Recognition by Combining Profiles with Kernel Methods
Authors: Bin Liu
Abstract:
Protein remote homology detection and fold recognition are two of the most important tasks in protein sequence analysis, which is critical for protein structure and function studies. In this study, we combined profile-based features with various string kernels and constructed several computational predictors for protein remote homology detection and fold recognition. Experimental results on two widely used benchmark datasets showed that these methods outperformed the competing methods, indicating that these predictors are useful computational tools for protein sequence analysis. By analyzing the discriminative features of the training models, some interesting patterns were discovered that reflect the characteristics of protein superfamilies and folds, which are important for researchers interested in finding the patterns of protein folds.
Keywords: protein remote homology detection, protein fold recognition, profile-based features, Support Vector Machines (SVMs)
Procedia PDF Downloads 161
14899 Developing a Viral Artifact to Improve Employees’ Security Behavior
Authors: Stefan Bauer, Josef Frysak
Abstract:
According to the scientific information management literature, the improper use of information technology (e.g. personal computers) by employees is one main cause of operational and information security loss events. Therefore, organizations implement information security awareness programs to increase employees' awareness and thereby prevent loss events. However, in many cases these information security awareness programs rely on conventional delivery methods like posters, leaflets, or internal messages to make employees aware of information security policies. We assume that a viral information security awareness video might be a more effective medium than the conventional methods commonly used by organizations. The purpose of this research is to develop a viral video artifact to improve employee security behavior concerning information technology.
Keywords: information security awareness, delivery methods, viral videos, employee security behavior
Procedia PDF Downloads 542
14898 Effectiveness of Online Language Learning
Authors: Shazi Shah Jabeen, Ajay Jesse Thomas
Abstract:
The study is aimed at understanding the learning trends of students who opt for online language courses and at assessing the effectiveness of such courses. Multiple factors, including the use of the latest available technology and the skills trained by these online methods, have been assessed. An attempt has been made to answer how each of the various language skills is trained online and how effective the online methods are compared to classroom methods in which students interact with peers and an instructor. A mixed methods research design was followed, with information collected through a questionnaire survey and in-depth interviews with a number of respondents across various institutes and study centers located in the United Arab Emirates. The questionnaire contained 19 questions, including 7 sub-questions. The study revealed that students find learning with an instructor to be far more effective than learning alone in an online environment, and that they prefer the classroom environment to the online setting for language learning.
Keywords: effectiveness, language, online learning, skills
Procedia PDF Downloads 589
14897 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models
Authors: I. V. Pinto, M. R. Sooriyarachchi
Abstract:
It can frequently be observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted to a real-life dataset using the four methods, and the performance of the test was compared for each model.
Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error
Procedia PDF Downloads 142
14896 The Appraisal of Construction Sites Productivity: In Kendall’s Concordance
Authors: Abdulkadir Abu Lawal
Abstract:
For want of reliable cardinal numerical data, the phenomena linked to productivity, such as operational costs and company turnovers, could not be investigated directly; this would not have given insight into the root of productivity problems at individual sites. Instead, ordinal ranking by the professionals most directly involved with construction sites was applied and assessed with Kendall's concordance. Responses gathered from independent architects, builders/engineers, and quantity surveyors were analyzed. The responses were based on factors that affect site productivity, categorized as head office factors, resource management effectiveness factors, motivational factors, and training/skill development factors. It was found that productivity is low and has to be improved in order to facilitate Nigerian efforts in bridging the country's infrastructure deficit. The significance of this work is underlined by a Kendall's coefficient of concordance of 0.78, and remedial measures must be emphasized to stimulate better productivity. Further detailed study can be undertaken using fuzzy logic analysis on a wider Delphi survey.
Keywords: factors, Kendall's coefficient of concordance, magnitude of agreement, percentage magnitude of dichotomy, ranking variables
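Kendall's coefficient of concordance for m raters ranking n items (with no ties) is W = 12S / (m^2 (n^3 - n)), where S is the sum of squared deviations of the item rank sums from their mean; W = 1 means perfect agreement. A minimal sketch with hypothetical rankings, not the survey responses:

```python
import numpy as np

def kendalls_w(ranks):
    """Kendall's W for an (m raters x n items) matrix of untied ranks."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)                    # total rank per item
    S = ((rank_sums - rank_sums.mean()) ** 2).sum()  # deviations from the mean
    return 12.0 * S / (m ** 2 * (n ** 3 - n))

# Three hypothetical raters ranking five productivity factors (1 = most important)
ranks = [[1, 2, 3, 4, 5],
         [2, 1, 3, 5, 4],
         [1, 3, 2, 4, 5]]
W = kendalls_w(ranks)
```

With identical rankings from every rater, W is exactly 1; a value such as the 0.78 reported above indicates strong but imperfect agreement among the professionals.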
Procedia PDF Downloads 627
14895 A Comparison Study of Different Methods Used in the Detection of Giardia lamblia on Fecal Specimen of Children
Authors: Muhammad Farooq Baig
Abstract:
Objective: The purpose of this study was to compare the results obtained using a single fecal specimen for ova and parasite (O&P) examination, direct immunofluorescence assay (DFA), and two conventional staining methods. Design: One hundred and fifty fecal specimens from children were collected and examined by each method; the O&P examination and the DFA were used as the reference methods. Setting: The study was performed at the laboratory of the Basic Medical Science Institute, JPMC Karachi. Patients or Other Participants: The fecal specimens were collected from children with a suspected Giardia lamblia infection. Main Outcome Measures: The amount of agreement and disagreement between methods; 1) the presence of giardiasis in our population; 2) the sensitivity and specificity of each method. Results: There were 45 (30%) positive and 105 (70%) negative results on DFA, 41 (27.4%) positive and 109 (72.6%) negative with the iodine method, and 34 (22.6%) positive and 116 (77.4%) negative with the saline method. The sensitivity and specificity of DFA in comparison to the iodine method were 92.2% and 92.7%, respectively. The sensitivity and specificity of DFA in comparison to the saline method were 91.2% and 87.9%, respectively. The sensitivities of the iodine and saline methods in comparison to DFA were 82.2% and 68.8%, respectively. There is a marked difference in sensitivity between DFA and the conventional methods. Conclusion: The study supported the findings of other investigators who concluded that the DFA method has greater sensitivity. The immunologic methods were more efficient and quicker than the conventional O&P method.
Keywords: direct immunofluorescence assay (DFA), ova and parasite (O&P), Giardia lamblia, children, medical science
Procedia PDF Downloads 423
14894 Evaluation of Simple, Effective and Affordable Processing Methods to Reduce Phytates in the Legume Seeds Used for Feed Formulations
Authors: N. A. Masevhe, M. Nemukula, S. S. Gololo, K. G. Kgosana
Abstract:
Background and Study Significance: Legume seeds are important in agriculture as they are used for feed formulations due to their nutrient density, low cost, and easy accessibility. Although they are important sources of energy, proteins, carbohydrates, vitamins, and minerals, they contain abundant quantities of anti-nutritive factors that reduce the bioavailability of nutrients, the digestibility of proteins, and mineral absorption in livestock. However, the removal of these factors is costly, as it requires expensive state-of-the-art techniques such as high-pressure and thermal processing. Basic Methodologies: The aim of the study was to investigate cost-effective methods that can be used to reduce the inherent phytates, as putative antinutrients, in legume seeds. The seeds of Arachis hypogaea, Pisum sativum and Vigna radiata L. were subjected to six single processing methods, namely raw seeds plus dehulling (R+D), soaking plus dehulling (S+D), ordinary cooking plus dehulling (C+D), infusion plus dehulling (I+D), autoclaving plus dehulling (A+D) and microwaving plus dehulling (M+D), and five combined methods (S+I+D; S+A+D; I+M+D; S+C+D; S+M+D). All the processed seeds were dried, ground into powder, extracted, and analyzed on a microplate reader to determine the percentage of phytates per dry mass of the legume seeds. Phytic acid was used as a positive control, and one-way ANOVA was used to determine significant differences between the means of the processing methods at a threshold of 0.05. Major Findings: The results of the processing methods showed percentage yield ranges of 39.1-96%, 67.4-88.8%, and 70.2-93.8% for V. radiata, A. hypogaea and P. sativum, respectively. Although the raw seeds contained the highest phytate contents, ranging between 0.508 and 0.527%, as expected, R+D resulted in a slightly lower phytate percentage range of 0.469-0.485%, while the other processing methods resulted in phytate contents below 0.35%. The M+D and S+M+D methods showed low phytate percentage ranges of 0.276-0.296% and 0.272-0.294%, respectively, with the lowest percentage determined for S+M+D of P. sativum. Furthermore, these results were found to be significantly different (p<0.05). Since phytates cause micronutrient deficits by chelating important minerals such as calcium, zinc, iron, and magnesium, and cannot be digested by ruminants, their reduction may enhance nutrient bioavailability. Concluding Statement: Although the nutritive analysis of the processed legume seeds is still in progress, the M+D and S+M+D methods, which significantly reduced the phytates in the investigated legume seeds, may be recommended to local farmers and feed-producing industries to enhance animal health and production at an affordable cost.
Keywords: anti-nutritive factors, extraction, legume seeds, phytate
Procedia PDF Downloads 28
14893 Sensitivity Analysis for 14 Bus Systems in a Distribution Network with Distributed Generators
Authors: Lakshya Bhat, Anubhav Shrivastava, Shiva Rudraswamy
Abstract:
There has been considerable interest in the area of distributed generation in recent times. Distributed generators serve a wide range of loads and offer better efficiency. The major disadvantage of distributed generation, voltage control, is highlighted in this paper. The paper addresses voltage control at the buses of the IEEE 14-bus system by regulating reactive power. An analysis is carried out by selecting the optimum locations for placing the distributed generators through load flow analysis and observing where the voltage profile rises. MATLAB programming is used to simulate the voltage profile at the respective buses after the introduction of DGs. A tolerance limit of +/-5% of the base value has to be maintained, and 3 methods are used to maintain it. A sensitivity analysis of the 3 voltage control methods is carried out to determine the priority among them.
Keywords: distributed generators, distributed system, reactive power, voltage control, sensitivity analysis
Procedia PDF Downloads 702
14892 The Effects of Extraction Methods on Fat Content and Fatty Acid Profiles of Marine Fish Species
Authors: Yesim Özogul, Fethiye Takadaş, Mustafa Durmus, Yılmaz Ucar, Ali Rıza Köşker, Gulsun Özyurt, Fatih Özogul
Abstract:
It has been well documented that polyunsaturated fatty acids (PUFAs), especially eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), have beneficial effects on health, including the prevention of cardiovascular diseases, cancer and autoimmune disorders, the development of the brain and retina, and the treatment of major depressive disorder. Thus, an adequate intake of omega PUFAs is essential, and marine fish are generally the richest sources of PUFAs in the human diet. This study was conducted to evaluate the efficiency of different extraction methods (Bligh and Dyer, Soxhlet, microwave and ultrasonic) with respect to the fat content and fatty acid profiles of marine fish species (Mullus barbatus, Upeneus moluccensis, Mullus surmuletus, Anguilla anguilla, Pagellus erythrinus and Saurida undosquamis). The fish were caught by trawl in the Mediterranean Sea and immediately iced, then transported to the laboratory in ice and stored at -18oC in a freezer until the day of analysis. After extracting lipids from the fish by the different methods, the lipid samples were converted to their constituent fatty acid methyl esters. The fatty acid composition was analysed on a GC Clarus 500 with an autosampler (Perkin Elmer, Shelton, CT, USA) equipped with a flame ionization detector and a fused silica capillary SGE column (30 m x 0.32 mm ID x 0.25 mm BP20 0.25 UM, USA). The results showed significant differences (P < 0.05) in the fatty acids of all species, and the extraction methods affected the fat contents and fatty acid profiles of the fish species.
Keywords: extraction methods, fatty acids, marine fish, PUFA
Procedia PDF Downloads 267
14891 Formalizing a Procedure for Generating Uncertain Resource Availability Assumptions Based on Real Time Logistic Data Capturing with Auto-ID Systems for Reactive Scheduling
Authors: Lars Laußat, Manfred Helmus, Kamil Szczesny, Markus König
Abstract:
As one result of the project “Reactive Construction Project Scheduling using Real Time Construction Logistic Data and Simulation”, a procedure has been developed for using data about uncertain resource availability assumptions in reactive scheduling processes. Prediction data about resource availability is generated in a formalized way from real-time monitoring data, e.g., from auto-ID systems on the construction site and in the supply chains. The paper focuses on formalizing the procedure for monitoring construction logistic processes, detecting disturbances, and generating new, uncertain scheduling assumptions for the reactive resource-constrained simulation procedure, which is described further in other papers.
Keywords: auto-ID, construction logistic, fuzzy, monitoring, RFID, scheduling
Procedia PDF Downloads 513
14890 Simple Procedure for Probability Calculation of Tensile Crack Occurring in Rigid Pavement: A Case Study
Authors: Aleš Florian, Lenka Ševelová, Jaroslav Žák
Abstract:
The formation of tensile cracks in the concrete slabs of rigid pavement can be (among others) the initiation point of other, more serious failures, which can ultimately lead to complete degradation of the concrete slab and thus the whole pavement. Two measures can be used for the reliability assessment of this phenomenon: the probability of failure and/or the reliability index. Different methods can be used for their calculation; the simple ones are moment methods and simulation techniques. Two methods, the FOSM method and the Simple Random Sampling method, are verified and compared. The influence of information about the probability distributions and statistical parameters of the input variables, as well as of the limit state function, on the calculated reliability index and failure probability is studied at three points on the lower surface of concrete slabs of the older type of rigid pavement formerly used in the Czech Republic.
Keywords: failure, pavement, probability, reliability index, simulation, tensile crack
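As an illustration of the Simple Random Sampling technique named in this abstract, the sketch below estimates a failure probability by Monte Carlo sampling and converts it to a reliability index via the standard normal inverse CDF. The limit state function (tensile strength minus acting stress) and all distribution parameters are hypothetical stand-ins, not values from the study.

```python
import numpy as np
from statistics import NormalDist

def simple_random_sampling(g, sample_inputs, n=200_000, seed=42):
    """Estimate the failure probability P[g(X) < 0] by Simple Random Sampling,
    then convert it to a reliability index beta = -Phi^-1(Pf)."""
    rng = np.random.default_rng(seed)
    x = sample_inputs(rng, n)            # draw n realizations of the input vector
    pf = float(np.mean(g(x) < 0.0))      # failure occurs where the limit state is negative
    beta = -NormalDist().inv_cdf(pf) if 0.0 < pf < 1.0 else float("inf")
    return pf, beta

# Illustrative inputs: tensile strength R and acting stress S, both normal.
# The parameters below are assumptions for the sketch, not data from the paper.
def sample_rs(rng, n):
    r = rng.normal(4.5, 0.45, n)   # concrete tensile strength [MPa]
    s = rng.normal(3.0, 0.60, n)   # traffic-induced tensile stress [MPa]
    return np.stack([r, s])

pf, beta = simple_random_sampling(lambda x: x[0] - x[1], sample_rs)
```

Because R - S is itself normal here, the exact answer is known (beta = 2.0), which makes this toy case a convenient check on the sampling estimate.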
Procedia PDF Downloads 546
14889 A Two-Stage Bayesian Variable Selection Method with the Extension of Lasso for Geo-Referenced Data
Authors: Georgiana Onicescu, Yuqian Shen
Abstract:
Due to the complex nature of geo-referenced data, multicollinearity of the risk factors in public health spatial studies is a commonly encountered issue, which leads to low parameter estimation accuracy because it inflates the variance in the regression analysis. To address this issue, we proposed a two-stage variable selection method by extending the least absolute shrinkage and selection operator (Lasso) to the Bayesian spatial setting, investigating the impact of risk factors on health outcomes. Specifically, in stage I, we performed variable selection using Bayesian Lasso and several other variable selection approaches. Then, in stage II, we performed model selection with only the variables selected in stage I and compared the methods again. To evaluate the performance of the two-stage variable selection methods, we conducted a simulation study with different distributions for the risk factors, using geo-referenced count data as the outcome and Michigan as the research region. We considered the cases when all candidate risk factors are independently normally distributed or follow a multivariate normal distribution with different correlation levels. Two other Bayesian variable selection methods, a binary indicator and the combination of a binary indicator and Lasso, were considered and compared as alternatives. The simulation results indicated that the proposed two-stage Bayesian Lasso variable selection method has the best performance for both the independent and dependent cases considered. When compared with the one-stage approach and the two alternative methods, the two-stage Bayesian Lasso approach provides the highest estimation accuracy in all scenarios considered.
Keywords: Lasso, Bayesian analysis, spatial analysis, variable selection
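The two-stage idea (select variables in stage I, refit with only the survivors in stage II) can be sketched with a classical, non-Bayesian Lasso as a stand-in for the Bayesian spatial Lasso used in the paper. The coordinate-descent solver, the alpha value, and the toy data are all illustrative assumptions.

```python
import numpy as np

def lasso_coordinate_descent(X, y, alpha, n_iter=200):
    """Plain Lasso via coordinate descent with soft-thresholding.
    Objective: (1/2)||y - X b||^2 + n * alpha * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            partial = y - X @ beta + X[:, j] * beta[j]   # residual excluding feature j
            rho = X[:, j] @ partial
            beta[j] = np.sign(rho) * max(abs(rho) - n * alpha, 0.0) / col_ss[j]
    return beta

def two_stage_selection(X, y, alpha=0.1):
    """Stage I: Lasso shrinks weak predictors to exactly zero (selection).
    Stage II: ordinary least squares refit on the survivors only,
    which removes the Lasso shrinkage bias from the final estimates."""
    selected = np.flatnonzero(lasso_coordinate_descent(X, y, alpha))
    coef, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
    return selected, coef

# Toy data: two real risk factors plus three pure-noise columns.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=500)
selected, coef = two_stage_selection(X, y)
```

The refit in stage II is what the abstract's "model selection with only the selected variables" step corresponds to in this simplified, non-spatial setting.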
Procedia PDF Downloads 143
14888 Risk Prioritization in Tunneling Construction Projects
Authors: David Nantes, George Gilbert
Abstract:
Numerous risks can crop up as a tunneling project develops, and it is crucial to be aware of them. The unexpected nature of tunneling projects and the interconnectedness of risk occurrences make risk assessment a significant challenge. The purpose of this study is to provide a hybrid FDEMATEL-ANP model to help prioritize risks during tunnel construction projects. The Fuzzy Decision-Making Trial and Evaluation Laboratory (FDEMATEL) allows the model to take into account both the ambiguity in expert judgments and the relative severity of interdependencies across risk occurrences. The Analytic Network Process (ANP) method is then used to rank priorities and assess project risks. The authors provide a case study of a subway tunneling construction project to support the validity of their methodology. The results showed that the proposed method successfully isolated the key risk factors and elucidated their interplay in the case study. The proposed method has the potential to become a helpful resource for evaluating the risks associated with tunnel construction projects.
Keywords: risk, prioritization, FDEMATEL, ANP, tunneling construction projects
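The DEMATEL core of such a hybrid model can be sketched in its crisp (non-fuzzy) form: normalize the direct-influence matrix N, then compute the total-relation matrix T = N (I - N)^-1, which accumulates all indirect influence paths between risk events. The three risks and their 0-4 influence scores below are hypothetical, not taken from the case study.

```python
import numpy as np

def dematel(direct):
    """Crisp DEMATEL: normalized direct-influence matrix, total-relation
    matrix, and the prominence / net-relation summaries used for ranking."""
    direct = np.asarray(direct, dtype=float)
    s = max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
    n_mat = direct / s                                     # normalized influence
    t = n_mat @ np.linalg.inv(np.eye(len(direct)) - n_mat) # total relation T
    prominence = t.sum(axis=1) + t.sum(axis=0)             # overall involvement
    relation = t.sum(axis=1) - t.sum(axis=0)               # net cause (+) / effect (-)
    return t, prominence, relation

# Hypothetical influence scores among three tunneling risks:
# 0 = ground collapse, 1 = water ingress, 2 = equipment failure.
direct = [[0, 3, 2],
          [1, 0, 2],
          [0, 1, 0]]
t, prominence, relation = dematel(direct)
```

In a full FDEMATEL, the entries of `direct` would be fuzzy numbers aggregated from expert judgments and defuzzified before this step; the matrix algebra itself is unchanged.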
Procedia PDF Downloads 92
14887 A Quantitative Evaluation of Text Feature Selection Methods
Authors: B. S. Harish, M. B. Revanasiddappa
Abstract:
Due to the rapid growth of text documents in digital form, automated text classification has become an important research area in the last two decades. The major challenges of text document representation are high dimensionality, sparsity, volume, and semantics. Since terms are the only features that can be found in documents, the selection of good terms (features) plays a very important role. In text classification, feature selection is a strategy that can be used to improve classification effectiveness, computational efficiency, and accuracy. In this paper, we present a quantitative analysis of the most widely used feature selection (FS) methods, viz. Term Frequency-Inverse Document Frequency (tfidf), Mutual Information (MI), Information Gain (IG), Chi-Square (χ²), Term Frequency-Relevance Frequency (tfrf), Term Strength (TS), Ambiguity Measure (AM), and Symbolic Feature Selection (SFS), to classify text documents. We evaluated all the feature selection methods on standard datasets such as 20 Newsgroups, the 4 University dataset, and Reuters-21578.
Keywords: classifiers, feature selection, text classification
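Of the methods listed, the Chi-Square score has a particularly compact closed form for a binary term against a binary class, built from the 2x2 contingency table of term presence versus class membership. The toy corpus below is an invented example, not one of the evaluated datasets.

```python
import numpy as np

def chi_square_score(term_present, labels):
    """Chi-Square feature score for one binary term against binary class labels.
    A larger score means the term's presence depends more strongly on the class,
    so the term is a better discriminator; an independent term scores 0."""
    a = np.sum(term_present & labels)        # term present, class positive
    b = np.sum(term_present & ~labels)       # term present, class negative
    c = np.sum(~term_present & labels)       # term absent, class positive
    d = np.sum(~term_present & ~labels)      # term absent, class negative
    n = a + b + c + d
    denom = (a + c) * (b + d) * (a + b) * (c + d)
    return n * (a * d - b * c) ** 2 / denom if denom else 0.0

# Toy corpus of 8 documents: "goal" appears mostly in sports documents,
# while "the" appears everywhere and carries no class information.
labels = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=bool)   # 1 = sports class
goal   = np.array([1, 1, 1, 0, 0, 0, 0, 1], dtype=bool)   # occurrences of "goal"
the    = np.array([1, 1, 1, 1, 1, 1, 1, 1], dtype=bool)   # occurrences of "the"
```

Ranking terms by this score and keeping the top k is the usual way such a scorer feeds into classification.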
Procedia PDF Downloads 458
14886 Development of ELF Passive Shielding Application Using Magnetic Aqueous Substrate
Authors: W. N. L. Mahadi, S. N. Syed Zin, W. A. R. Othman, N. A. Mohd Rasyid, N. Jusoh
Abstract:
Public concern about Extremely Low Frequency (ELF) Electromagnetic Field (EMF) exposure has grown over the last few decades. Electrical substations and high-tension rooms (HT rooms) in commercial buildings are among the sources emanating ELF magnetic fields. This paper discusses various shielding methods conventionally used to mitigate ELF exposure. Nevertheless, the standard methods were found to be impractical and incapable of meeting current shielding demands. In response, notable research has been conducted in an effort to invent more convenient and efficient methods, such as magnetic aqueous shielding or paint, textile, and paper shielding. A mitigation method using a magnetic aqueous substrate, Manganese Zinc Ferrite (Mn0.4Zn0.6Fe2O4), is proposed in this paper for further investigation. The magnetic field and flux distribution inside the aqueous magnetic material are evaluated to optimize shielding against ELF-EMF exposure.
Keywords: ELF shielding, magnetic aqueous substrate, shielding effectiveness, passive shielding, magnetic material
Procedia PDF Downloads 531
14885 A Comparative Analysis of Traditional and Advanced Methods in Evaluating Anti-corrosion Performance of Sacrificial and Barrier Coatings
Authors: Kazem Sabet-Bokati, Ilia Rodionov, Marciel Gaier, Kevin Plucknett
Abstract:
Protective coatings play a pivotal role in mitigating corrosion and preserving the integrity of metallic structures exposed to harsh environmental conditions. The diversity of corrosive environments necessitates the development of protective coatings suitable for various conditions. Accurately selecting and interpreting analysis methods is crucial in identifying the most suitable protective coatings for the various corrosive environments. This study conducted a comprehensive comparative analysis of traditional and advanced methods to assess the anti-corrosion performance of sacrificial and barrier coatings. The protective performance of pure epoxy, zinc-rich epoxy, and cold galvanizing coatings was evaluated using salt spray tests, together with electrochemical impedance spectroscopy (EIS) and potentiodynamic polarization methods. The performance of each coating was thoroughly differentiated under both atmospheric and immersion conditions. The distinct protective performance of each coating against atmospheric corrosion was assessed using traditional standard methods. Additionally, the electrochemical responses of these coatings in immersion conditions were systematically studied, and a detailed discussion on interpreting the electrochemical responses is provided. Zinc-rich epoxy and cold galvanizing coatings offer superior anti-corrosion performance against atmospheric corrosion, while the pure epoxy coating excels in immersion conditions.
Keywords: corrosion, barrier coatings, sacrificial coatings, salt-spray, EIS, polarization
Procedia PDF Downloads 65
14884 Theoretical Comparisons and Empirical Illustration of Malmquist, Hicks–Moorsteen, and Luenberger Productivity Indices
Authors: Fatemeh Abbasi, Sahand Daneshvar
Abstract:
Productivity is one of the essential goals of companies seeking to improve performance; as a strategy-oriented measure, it forms the basis of a company's economic growth. The history of productivity goes back centuries, but most researchers in the early twentieth century defined productivity as the relationship between a product and the factors used in its production. Productivity as the optimal use of available resources, meaning "more output using less input", can increase companies' capacity for economic growth and prosperity. A good quality of life based on economic progress also depends on productivity growth in society. Productivity is therefore a national priority for any developed country. There are several methods for measuring productivity growth, which can be divided into parametric and non-parametric methods. Parametric methods rely on the existence of a functional form in their hypotheses, while non-parametric methods do not require a function and are based on empirical evidence. One of the most popular non-parametric methods is Data Envelopment Analysis (DEA), which measures changes in productivity over time. DEA evaluates the productivity of decision-making units (DMUs) based on mathematical models. This method uses multiple inputs and outputs to compare the productivity of similar DMUs such as banks, government agencies, companies, and airports. Non-parametric methods are themselves divided into frontier and non-frontier approaches. The Malmquist productivity index (MPI) proposed by Caves, Christensen, and Diewert (1982), the Hicks–Moorsteen productivity index (HMPI) proposed by Bjurek (1996), and the Luenberger productivity indicator (LPI) proposed by Chambers (2002) are powerful tools for measuring productivity changes over time.
This study compares the Malmquist, Hicks–Moorsteen, and Luenberger indices theoretically and empirically based on DEA models and reviews their strengths and weaknesses.
Keywords: data envelopment analysis, Hicks–Moorsteen productivity index, Luenberger productivity indicator, Malmquist productivity index
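The Malmquist index discussed above can be computed directly once four distance-function values are available; in a DEA setting these would come from four LP efficiency scores, but the sketch below simply takes them as inputs with hypothetical values. It computes the Caves-Christensen-Diewert geometric-mean Malmquist index together with its standard decomposition into efficiency change and technical change.

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """CCD Malmquist productivity index from four distance-function values.
    d_a_b is the distance of the period-b observation measured against the
    period-a frontier. Returns (MPI, efficiency change, technical change),
    where MPI = EC * TC and MPI > 1 indicates productivity growth."""
    mpi = math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
    efficiency_change = d_t1_t1 / d_t_t
    technical_change = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return mpi, efficiency_change, technical_change

# Hypothetical scores for one DMU between periods t and t+1
# (not empirical results from the study).
mpi, ec, tc = malmquist(d_t_t=0.8, d_t_t1=1.0, d_t1_t=0.7, d_t1_t1=0.9)
```

Because the decomposition is multiplicative, checking MPI against EC * TC is a quick sanity test on any implementation.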
Procedia PDF Downloads 194
14883 Sensitivity Analysis for 14 Bus Systems in a Distribution Network with Distribution Generators
Authors: Lakshya Bhat, Anubhav Shrivastava, Shivarudraswamy
Abstract:
There has been considerable interest in the area of Distributed Generation in recent times. Distributed Generators can supply a wide range of loads with better efficiency. The major disadvantage of Distributed Generation, voltage control, is highlighted in this paper. The paper addresses voltage control at the buses of the IEEE 14-bus system by regulating reactive power. An analysis is carried out to select the most suitable locations for placing the Distributed Generators, using load flow analysis to see where the voltage profile rises. MATLAB programming is used to simulate the voltage profile at the respective buses after the introduction of DGs. A tolerance limit of +/-5% of the base value has to be maintained. To maintain this tolerance limit, three methods are used, and a sensitivity analysis of the three methods for voltage control is carried out to determine the priority among them.
Keywords: distributed generators, distributed system, reactive power, voltage control, sensitivity analysis
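The +/-5% acceptance band used in this abstract can be written as a direct check on per-unit bus voltages. The bus numbers and voltage values below are hypothetical placeholders, not simulation results from the paper.

```python
def buses_outside_tolerance(voltages_pu, tol=0.05):
    """Return the buses whose per-unit voltage leaves the +/-5% band
    around the 1.0 p.u. base value (the paper's acceptance criterion)."""
    return {bus: v for bus, v in voltages_pu.items()
            if not (1.0 - tol <= v <= 1.0 + tol)}

# Hypothetical post-load-flow voltages at four buses of a 14-bus system.
voltages = {4: 1.01, 5: 0.97, 9: 0.94, 14: 1.06}
violations = buses_outside_tolerance(voltages)
```

Any bus flagged here would be a candidate for the reactive-power-based corrections the paper compares.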
Procedia PDF Downloads 587
14882 Modeling and Analyzing Controversy in Large-Scale Cyber-Argumentation
Authors: Najla Althuniyan
Abstract:
Online discussions take place across different platforms. These discussions have the potential to extract crowd wisdom and capture collective intelligence from a different perspective. However, certain phenomena, such as controversy, often appear in online argumentation and make the discussion between participants heated. Heated discussions can be used to extract new knowledge. Therefore, detecting the presence of controversy is an essential task in determining whether collective intelligence can be extracted from online discussions. This paper uses existing measures for estimating controversy quantitatively in cyber-argumentation. First, it defines controversy in different fields, and then it identifies the attributes of controversy in online discussions. The distribution of user opinions and the distance between opinions are used to calculate the controversial degree of a discussion. Finally, the results from each controversy measure are discussed and analyzed using an empirical study generated with a cyber-argumentation tool. This is an improvement over existing measurements because it does not require ground-truth data or specific settings and can be adapted to distribution-based or distance-based opinions.
Keywords: online argumentation, controversy, collective intelligence, agreement analysis, collaborative decision-making, fuzzy logic
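A minimal distance-based controversy estimate, in the spirit of the distance measures this abstract describes, is the average pairwise gap between participants' stances, normalized to [0, 1]. The [-1, 1] stance scale and the toy opinion vectors are assumptions for the sketch, not the paper's actual measures.

```python
from itertools import combinations

def controversy_degree(opinions):
    """Distance-based controversy estimate: the average pairwise distance
    between stances on a [-1, 1] scale (1 = strongly agree, -1 = strongly
    disagree), normalized by the widest possible gap so the result is in
    [0, 1]. Consensus scores near 0; two opposed camps score high."""
    pairs = list(combinations(opinions, 2))
    max_distance = 2.0                      # widest possible gap on [-1, 1]
    return sum(abs(a - b) for a, b in pairs) / (len(pairs) * max_distance)

consensus = controversy_degree([0.8, 0.9, 0.7, 0.85])   # everyone roughly agrees
polarized = controversy_degree([1.0, 1.0, -1.0, -1.0])  # two opposed camps
```

A threshold on this degree could then serve as the detection step the abstract calls essential before attempting to extract collective intelligence.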
Procedia PDF Downloads 116