Search results for: methods of Lagrange multipliers
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14733

14313 The Influence of Covariance Hankel Matrix Dimension on Algorithms for VARMA Models

Authors: Celina Pestano-Gabino, Concepcion Gonzalez-Concepcion, M. Candelaria Gil-Fariña

Abstract:

Some estimation methods for VARMA models, and multivariate time series models in general, rely on the use of a Hankel matrix. It is known that if the data sample is large enough and the dimension of the Hankel matrix is unnecessarily large, the result may be needless computation as well as numerical problems. In this sense, the aim of this paper is two-fold. First, we provide some theoretical results for these matrices which translate into a lower dimension for the matrices normally used in the algorithms. This contribution thus serves to improve those methods from a numerical and, presumably, statistical point of view. Second, we have chosen an estimation algorithm to illustrate our improvements in practice. The results we obtained in a simulation of VARMA models show that increasing the size of the Hankel matrix beyond the theoretical bound proposed as valid does not necessarily lead to improved practical results. Therefore, for future research, we propose conducting similar studies using any of the linear system estimation methods that depend on Hankel matrices.
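
As a rough illustration of the object the paper studies, the sketch below builds a block Hankel matrix from sample autocovariances of a simulated bivariate series; the function names and the 4x4 block dimensions are our assumptions, not the authors', and a real study would use an estimated VARMA process rather than white noise.

```python
import numpy as np

def sample_autocovariances(X, max_lag):
    """Sample autocovariance matrices Gamma_k of a (T x m) multivariate series."""
    T, m = X.shape
    Xc = X - X.mean(axis=0)
    return [Xc[k:].T @ Xc[:T - k] / T for k in range(max_lag + 1)]

def covariance_hankel(gammas, rows, cols):
    """Block Hankel matrix H with block (i, j) = Gamma_{i+j+1}."""
    m = gammas[0].shape[0]
    H = np.zeros((rows * m, cols * m))
    for i in range(rows):
        for j in range(cols):
            H[i*m:(i+1)*m, j*m:(j+1)*m] = gammas[i + j + 1]
    return H

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2))      # stand-in for a bivariate VARMA sample
H = covariance_hankel(sample_autocovariances(X, 8), rows=4, cols=4)
print(H.shape, np.linalg.matrix_rank(H, tol=1e-1))  # rank hints at Kronecker indices
```

The numerical rank of H is what identification algorithms probe to recover the Kronecker indices; the paper's point is that the block dimensions can often be chosen smaller than is customary without losing information.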

Keywords: covariance Hankel matrices, Kronecker indices, system identification, VARMA models

Procedia PDF Downloads 217
14312 Some Issues of Measurement of Impairment of Non-Financial Assets in the Public Sector

Authors: Mariam Vardiashvili

Abstract:

The economic significance of the asset impairment process is considerable. Impairment reflects the reduction of the future economic benefits or service potential embodied in an asset. The assets owned by public sector entities either bring economic benefits or are used for the delivery of free-of-charge services. Consequently, they are classified as cash-generating and non-cash-generating assets. IPSAS 21 - Impairment of Non-Cash-Generating Assets and IPSAS 26 - Impairment of Cash-Generating Assets have been designed with this specificity in mind. When measuring impairment of assets, it is important to select the relevant methods. For measuring the impairment of non-cash-generating assets, IPSAS 21 recommends three methods: the depreciated replacement cost approach, the restoration cost approach, and the service units approach. Value in use of cash-generating assets (as per IPSAS 26) is measured as the discounted value of the cash flows to be received in the future. The article classifies public sector assets as non-cash-generating and cash-generating assets and also deals with the factors which should be considered when evaluating impairment of assets. The essence of impairment of non-financial assets and the methods of its measurement are formulated according to IPSAS 21 and IPSAS 26. The main emphasis is put on the different methods of measuring the value in use of impaired cash-generating and non-cash-generating assets, and on how those methods are selected. The traditional and the expected cash flow approaches for calculating the discounted value are reviewed. The article also discusses the recognition of impairment loss and its reflection in financial reporting. The article concludes that regardless of the functional purpose of the impaired asset, and whichever method is used for measuring it, the presentation of realistic information regarding the value of the assets should be ensured in the financial reporting. In the theoretical development of the issue, the methods of scientific abstraction, analysis, and synthesis were used. The research was carried out with a systemic approach, drawing on international accounting standards and on the theoretical research and publications of Georgian and foreign scientists.
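
To make the two discounting approaches concrete, here is a minimal sketch with invented figures; IPSAS 26 itself prescribes no code, and the cash flows, probabilities, and rates below are purely illustrative.

```python
def present_value(cash_flows, rate):
    """Discount a series of year-end cash flows at a single rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Traditional approach: one "most likely" cash-flow series, risk carried in the rate.
most_likely = [100, 100, 100]
print(round(present_value(most_likely, 0.08), 2))

# Expected cash flow approach: probability-weighted cash flows in each period,
# discounted at a rate with less risk loading (all figures hypothetical).
scenarios = [([80, 80, 80], 0.3), ([100, 100, 100], 0.5), ([120, 120, 120], 0.2)]
expected = [sum(p * cfs[t] for cfs, p in scenarios) for t in range(3)]
print(round(present_value(expected, 0.05), 2))
```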

Keywords: cash-generating assets, non-cash-generating assets, recoverable (usable restorative) value, value in use

Procedia PDF Downloads 114
14311 Customizable Sonic EEG Neurofeedback Environment to Train Self-Regulation of Momentary Mental and Emotional State

Authors: Cyril Kaplan, Nikola Jajcay

Abstract:

We developed a purely sonic, music-based, highly customizable EEG neurofeedback environment designed to administer a new neurofeedback training protocol. The training protocol concentrates on improving the ability to switch between several mental states characterized by different levels of arousal, each correlated with specific brain wave activity patterns in specific regions of the neocortex. This paper describes the neurofeedback training environment we developed and its specificities, and can thus serve as a manual for other neurofeedback users (both researchers and practitioners) interested in our editable open-source program (available for download and use under a CC license). Responses and reactions of the first trainees who used the environment are presented in this article. A combination of qualitative methods (thematic analysis of the trainees' neurophenomenological insights and post-session semi-structured interviews) and quantitative methods (power spectra analysis of the EEG recorded during training) was employed to obtain a multifaceted view of our new training protocol.
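
As a rough illustration of the quantitative arm, the sketch below computes band power from a synthetic EEG channel with Welch's method; the sampling rate, frequency bands, and arousal index are our assumptions, not the protocol's published parameters.

```python
import numpy as np
from scipy.signal import welch

fs = 256                              # sampling rate in Hz (assumed)
rng = np.random.default_rng(1)
eeg = rng.standard_normal(30 * fs)    # stand-in for one channel of training EEG

f, psd = welch(eeg, fs=fs, nperseg=fs * 2)   # power spectral density, 0.5 Hz bins

def band_power(f, psd, lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.trapz(psd[mask], f[mask])

alpha = band_power(f, psd, 8, 12)     # band edges are illustrative only
beta = band_power(f, psd, 15, 30)
print("alpha/beta arousal index:", alpha / beta)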

Keywords: EEG neurofeedback, mixed methods, self-regulation, switch-between-states training

Procedia PDF Downloads 192
14310 Examining Pre-Consumer Textile Waste Recycling, Barriers to Implementation, and Participant Demographics: A Review of Literature

Authors: Madeline W. Miller

Abstract:

The global textile industry produces pollutants in the form of liquid discharge, solid waste, and emissions into the natural environment. Textile waste resulting from garment production and other manufacturing processes makes a significant contribution to the amount of waste landfilled globally. While the majority of curbside and other convenient recycling methods cater to post-consumer paper and plastics, pre-consumer textile waste is often discarded with trash and is commonly classified as ‘other’ in municipal solid waste breakdowns. On a larger scale, many clothing manufacturers and other companies utilizing textiles have not yet identified or begun using the most sustainable methods for discarding their post-industrial, pre-consumer waste. To lessen the amount of waste sent to landfills, there are post-industrial, pre-consumer textile waste recycling methods that can be used to give textiles a new life. This process requires that textile and garment manufacturers redirect their waste to companies that use industrial machinery to shred or fiberize these materials in preparation for their second life. The goal of this literature review is to identify the recycling and reuse challenges faced by producers within the clothing and textile industry that prevent these companies from utilizing the described recycling methods, causing them to opt for landfill. The literature analyzed in this review reflects manufacturer sentiments toward waste disposal and recycling. The results of this review indicate that the cost of logistics is the determining factor in whether companies recycle their pre-consumer textile waste, and that the most applicable and successful textile waste recycling methods require a company separate from the manufacturer to account for waste production, provide receptacles for waste, arrange waste transport, and identify a secondary use for the material at a price point below that of traditional waste disposal services.

Keywords: leadership demographics, post-industrial textile waste, pre-consumer textile waste, industrial shoddy

Procedia PDF Downloads 124
14309 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals

Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor

Abstract:

This article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. Research in this field follows two main approaches, each with its own strengths and weaknesses; the choice between them depends on the specific research question and the available resources, and by combining them and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose: the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different line of work, a single classifier is used instead of multiple classifiers, with different sets of features employed to improve the results. The article provides an in-depth examination of each technique, highlighting its advantages and limitations.
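
A minimal sketch of the second approach (a meta-classifier over several base classifiers) might look as follows; the base learners, the synthetic stand-in for extracted brain-signal features, and the use of scikit-learn stacking are our illustrative choices, not the specific methods surveyed in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Stand-in for extracted brain-signal features (e.g., band powers per electrode).
X, y = make_classification(n_samples=400, n_features=30, n_informative=10,
                           random_state=0)

base = [("knn", KNeighborsClassifier()),
        ("svm", SVC(probability=True, random_state=0)),
        ("rf", RandomForestClassifier(random_state=0))]

# The meta-classifier (logistic regression) learns how to weight the base models.
stack = StackingClassifier(estimators=base,
                           final_estimator=LogisticRegression(max_iter=1000))
print(cross_val_score(stack, X, y, cv=5).mean())
```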

Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers

Procedia PDF Downloads 47
14308 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods

Authors: A. Senthil Kumar, V. Murali Bhaskaran

Abstract:

In the information technology domain, people use various tools and software for official and personal purposes. Nowadays, people worry about choosing data access and extraction tools when buying and selling their products, and about various quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed algorithm, Multidirectional Rank Prediction (MDRP), is a decision-making algorithm intended to support effective strategic decisions at all levels of data extraction; it is applied to a real-time textile dataset and the results are analyzed. Finally, the results are compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
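
The abstract does not give MDRP itself, but a baseline PCC-based collaborative filtering predictor of the kind it is compared against can be sketched as follows; the toy rating matrix, neighbourhood size, and helper names are invented for illustration.

```python
import numpy as np

def pearson(u, v):
    """Pearson correlation over the items both users have rated."""
    mask = ~np.isnan(u) & ~np.isnan(v)
    if mask.sum() < 2:
        return 0.0
    uc, vc = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.sqrt((uc ** 2).sum() * (vc ** 2).sum())
    return (uc @ vc) / denom if denom else 0.0

def predict(R, user, item, k=2):
    """Neighbourhood prediction of R[user, item] from the k most similar users."""
    sims = np.array([pearson(R[user], R[v]) if v != user else -np.inf
                     for v in range(R.shape[0])])
    neigh = [v for v in np.argsort(sims)[::-1][:k] if not np.isnan(R[v, item])]
    num = sum(sims[v] * (R[v, item] - np.nanmean(R[v])) for v in neigh)
    den = sum(abs(sims[v]) for v in neigh)
    return np.nanmean(R[user]) + (num / den if den else 0.0)

R = np.array([[5, 3, np.nan, 1],
              [4, np.nan, 4, 1],
              [1, 1, 5, np.nan],
              [np.nan, 1, 4, 4.]])      # rows: users, columns: products
print(round(predict(R, user=0, item=2), 2))
```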

Keywords: Knowledge Discovery in Databases (KDD), Multidirectional Rank Prediction (MDRP), Pearson’s Correlation Coefficient (PCC), Vector Space Similarity (VSS)

Procedia PDF Downloads 258
14307 Geospatial Curve Fitting Methods for Disease Mapping of Tuberculosis in Eastern Cape Province, South Africa

Authors: Davies Obaromi, Qin Yongsong, James Ndege

Abstract:

Methods for interpolating scattered or regularly distributed data may be exact or inexact, and some are suited to data on a regular grid while others work on an irregular grid. In spatial epidemiology, it is important to examine how disease prevalence rates are distributed in space and how they relate to each other within a defined distance and direction. In this study, for the geographic and graphic representation of disease prevalence, linear and biharmonic spline methods were implemented in MATLAB and used to identify, localize, and compare smoothing in the distribution patterns of tuberculosis (TB) in the Eastern Cape Province. The aim of this study is to produce a smoother graphical disease map of TB prevalence patterns using 3-D curve fitting techniques, especially biharmonic splines, which can suppress noise easily by seeking a least-squares fit rather than exact interpolation. The datasets are represented as XYZ triplets, where X and Y are the spatial coordinates and Z is the variable of interest, in this case TB counts in the province. The smoothing spline fits a smooth curve to a set of noisy observations using a spline function, and it has become a conventional method owing to its high precision, simplicity, and flexibility. Surface and contour plots are produced for TB prevalence at the provincial level for 2012-2015. The general outlook of all the fittings showed a systematic pattern in the distribution of TB cases in the province, consistent with spatial statistical analyses previously carried out in the province. This method is rarely used in disease mapping applications, but it has the advantage that it can be evaluated at arbitrary locations rather than only on a rectangular grid, as in most traditional GIS methods of geospatial analysis.
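
MATLAB's biharmonic spline corresponds to its griddata 'v4' method; outside MATLAB, a thin-plate spline radial basis function with nonzero smoothing gives a similar least-squares, noise-suppressing surface. The sketch below uses SciPy and invented station data, so it illustrates the idea rather than reproducing the paper's fits.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)
xy = rng.uniform(0, 100, size=(40, 2))                      # stand-in locations
z = np.sin(xy[:, 0] / 15) + 0.1 * rng.standard_normal(40)   # stand-in TB rates

# Thin-plate spline RBF; smoothing > 0 gives a least-squares (noise-suppressing)
# fit rather than exact interpolation, in the spirit of the paper.
spline = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=1.0)

gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
surface = spline(grid).reshape(gx.shape)   # values for surface/contour plots
print(surface.shape)
```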

Keywords: linear, biharmonic splines, tuberculosis, South Africa

Procedia PDF Downloads 217
14306 A Survey of Dynamic QoS Methods in Software Defined Networking

Authors: Vikram Kalekar

Abstract:

Modern Internet Protocol (IP) networks deploy traditional and modern Quality of Service (QoS) management methods to ensure the smooth flow of network packets during regular operations. SDN (software-defined networking) networks have also made headway into better service delivery by means of novel QoS methodologies. While many of these techniques are experimental, some have been tested extensively in controlled environments, and a few have the potential to be deployed widely in the industry. In this survey, we analyze the approaches to QoS and resource allocation in SDN and comment on possible improvements to QoS management in the context of SDN.

Keywords: QoS, policy, congestion, flow management, latency, SDN, delay

Procedia PDF Downloads 167
14305 Enhancing Rural Agricultural Value Chains through Electric Mobility Services in Ethiopia

Authors: Clemens Pizzinini, Philipp Rosner, David Ziegler, Markus Lienkamp

Abstract:

Transportation is an integral part of most supply and value chains in modern economies. Smallholder farmers in rural Ethiopia face severe challenges along their supply and value chains. In particular, suitable, affordable, and available transport services are in high demand. To develop context-specific technical solutions, a problem-to-solution methodology based on interaction with technology is developed. With this approach, we fill the gap between proven transportation assessment frameworks and general user-centered techniques. Central to our approach is an electric test vehicle that is deployed in rural supply and value chains for research, development, and testing. Based on our objective and the derived methodological requirements, a set of existing methods is selected. Local partners are integrated into an organizational framework that executes major parts of this research endeavour in the Arsi Zone, Oromia Region, Ethiopia.

Keywords: agricultural value chain, participatory methods, agile methods, sub-Saharan Africa, Ethiopia, electric vehicle, transport service

Procedia PDF Downloads 49
14304 An Overview of Food Waste Management Technologies; The Advantages of Using New Management Methods over the Older Methods to Reduce the Environmental Impacts of Food Waste, Conserve Resources, and Energy Recovery

Authors: Bahareh Asefi, Fereidoun Farzaneh, Ghazaleh Asefi

Abstract:

Continuously increasing food waste produced on a global as well as national scale may lead to burgeoning environmental and economic problems. At the same time, the use efficiencies of natural resources such as land, water, and energy are decreasing. On the other hand, food waste has a high energy content, which seems ideal for achieving dual benefits in terms of energy recovery and improved resource use efficiency. Therefore, to decrease the environmental impacts of food waste and to conserve resources, researchers have focused on traditional methods of using food waste as a resource through approaches such as anaerobic digestion, composting, incineration, and landfill. The adverse environmental effects of growing food waste make it difficult for traditional food waste treatment and management methods to balance social, economic, and environmental benefits. The older technologies need little further development, but several new technologies, such as microbial fuel cells, food waste disposers, and bio-conversion of food waste, still need to be established or appropriately considered. It is pointed out that some new technologies can deliver several of these benefits at once. Since information about food waste and its management methods is critical for executable policy, this study reviews the latest information regarding the sources of food waste and its management technologies in several countries.

Keywords: food waste, management technology, innovative method, bio converting food waste, microbial fuel cell

Procedia PDF Downloads 87
14303 An Efficient Propensity Score Method for Causal Analysis With Application to Case-Control Study in Breast Cancer Research

Authors: Ms Azam Najafkouchak, David Todem, Dorothy Pathak, Pramod Pathak, Joseph Gardiner

Abstract:

Propensity score (PS) methods have become the standard tool for causal inference in observational studies, where exposure is not randomly assigned and confounding can therefore bias the estimation of the treatment effect on the outcome. For a binary outcome, the effect of treatment can be estimated by odds ratios, relative risks, or risk differences. However, different PS methods may give different estimates of the treatment effect. The main PS methods include matching, inverse probability weighting, stratification, and covariate adjustment on the PS. Because of the dangers of discretizing continuous variables (exposure, covariates), the focus of this paper is on how variation in cut-points or boundaries affects the average treatment effect (ATE) under the stratification PS method. We therefore avoid choosing arbitrary cut-points; instead, we discretize the PS continuously and accumulate information across all cut-points for inference. We use Monte Carlo simulation to evaluate the ATE, focusing on two PS methods: stratification and covariate adjustment on the PS. We then illustrate this on data from a case-control study of breast cancer, the Polish Women’s Health Study.
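
A minimal sketch of stratification on the PS, repeated over several numbers of strata in the spirit of accumulating across cut-points, could look like this; the data-generating process and stratum counts are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000
X = rng.standard_normal((n, 3))                               # confounders
ps_true = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))
T = rng.binomial(1, ps_true)                                  # exposure
Y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * T + X[:, 0]))))   # binary outcome

ps = LogisticRegression(max_iter=1000).fit(X, T).predict_proba(X)[:, 1]

def stratified_ate(ps, T, Y, n_strata):
    """ATE as a size-weighted average of within-stratum outcome differences."""
    edges = np.quantile(ps, np.linspace(0, 1, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, ps, side="right") - 1, 0, n_strata - 1)
    ate, total = 0.0, 0
    for s in range(n_strata):
        m = strata == s
        if T[m].sum() and (1 - T[m]).sum():      # need both groups present
            ate += m.sum() * (Y[m][T[m] == 1].mean() - Y[m][T[m] == 0].mean())
            total += m.sum()
    return ate / total

# Accumulate across several stratification choices instead of one arbitrary cut.
print([round(stratified_ate(ps, T, Y, k), 3) for k in (5, 10, 20)])
```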

Keywords: average treatment effect, propensity score, stratification, covariate adjustment, Monte Carlo estimation, breast cancer, case-control study

Procedia PDF Downloads 82
14302 An Approach to Capture, Evaluate and Handle Complexity of Engineering Change Occurrences in New Product Development

Authors: Mohammad Rostami Mehr, Seyed Arya Mir Rashed, Arndt Lueder, Magdalena Missler-Behr

Abstract:

This paper presents the conception that complex problems do not necessarily need a similarly complex solution in order to cope with the complexity; a simple solution based on established methods can provide a sufficient way to deal with it. To verify this conception, the paper focuses on the field of change management as part of the new product development process in the automotive sector. In this field, dealing with increasing complexity is essential, while only non-flexible, rigid processes that are not designed to handle complexity are available. The basic methodology of this paper can be divided into four main sections: 1) analyzing the complexity of change management, 2) a literature review to identify potential solutions and methods, 3) capturing and implementing the expertise of experts from the change management field of an automobile manufacturing company, and 4) a systematic comparison of the methods identified in the literature, connecting them with the defined requirements of change management complexity in order to develop a solution. As a practical outcome, this paper provides a method to capture the complexity of engineering changes (EC) and include it in the EC evaluation process, following case-related process guidance to cope with the complexity. Furthermore, this approach supports the conception that dealing with complexity is possible with rather simple and established methods by combining them into a powerful tool.

Keywords: complexity management, new product development, engineering change management, flexibility

Procedia PDF Downloads 173
14301 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods

Authors: H. J. Wattimanela, U. S. Passaribu, A. N. T. Puspito, S. W. Indratno

Abstract:

The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc, west of the collision zone, and the Halmahera arc to the east, the collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone: the distribution of earthquakes in each partition region, the type of distribution of earthquake occurrences per partition region, the mean occurrence of earthquakes in each partition region, and the correlation between the partition regions. We calculate the number of earthquakes using the partition method and analyze their behavior using conventional statistical methods. The data used are shallow earthquakes with magnitudes ≥ 4 SR for the period 1964-2013 in the Molluca Collision Zone. From the results, we can classify partition regions based on the correlation into two classes: strong and very strong. This classification can be used for an early warning system in disaster management.
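
A minimal sketch of the correlation step, with simulated counts standing in for the earthquake catalogue and with illustrative cut-offs for "strong" and "very strong" (the paper's actual thresholds are not given in the abstract), might look like this:

```python
import numpy as np

# Stand-in: yearly earthquake counts (1964-2013) in four partition regions.
rng = np.random.default_rng(4)
base = rng.poisson(20, size=50)                   # shared regional activity
counts = np.column_stack([base + rng.poisson(3, 50) for _ in range(4)])

corr = np.corrcoef(counts, rowvar=False)          # pairwise Pearson correlations

def label(r):                                     # illustrative cut-offs only
    return "very strong" if r >= 0.8 else "strong" if r >= 0.6 else "weaker"

for i in range(4):
    for j in range(i + 1, 4):
        print(f"regions {i}-{j}: r = {corr[i, j]:.2f} ({label(corr[i, j])})")
```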

Keywords: molluca collision zone, partition regions, conventional statistical methods, earthquakes, classifications, disaster management

Procedia PDF Downloads 470
14300 The Effect of Simultaneous Application of Laser Beam and Magnet in Treatment of Intervertebral Disc Herniation

Authors: Alireza Moghtaderi, Negin Khakpour

Abstract:

Disc herniation is a common complication in society and one of the main reasons for referral to physical medicine and rehabilitation clinics. Despite the various methods proposed for treating this disease, there is still disagreement on their success, especially for the non-surgical methods; the current study therefore aims at determining the effect of laser beam and magnet therapy on the treatment of intervertebral disc herniation. In a clinical trial study, 80 patients with intervertebral disc herniation underwent a combined package of treatment including magnet, laser beam, PRP, and prolotherapy during 6 months. The average age of the patients was 51.25 ± 10.7 years, with a range of 25-71 years. 30 men (37.5%) and 50 women (62.5%) took part in the study. The average weight of the patients was 64.3 ± 7.2 kg, with a range of 49-79 kg. The most frequent level of disc herniation was L5-S1, with 17 cases (21.3%). Disc herniation was severe in 30 cases before treatment, but this fell to 3 cases after treatment. This study indicates that combined treatment using non-invasive laser beam and magnet therapy is highly effective for discogenic diseases and mechanical pains of the spine.

Keywords: hallux, valgus, botulinum toxin a, pain

Procedia PDF Downloads 70
14299 Observed Changes in Constructed Precipitation at High Resolution in Southern Vietnam

Authors: Nguyen Tien Thanh, Günter Meon

Abstract:

Precipitation plays a key role in the water cycle, in defining local climatic conditions, and in ecosystems. It is also an important input parameter for water resources management and hydrologic models. With spatially continuous data, the certainty of discharge predictions and other environmental estimates is unquestionably better than without. Such data are, however, not always easy to acquire for a small basin, especially in the coastal region of Vietnam, owing to a sparse network of meteorological stations (30 stations) along a coastline of about 3,260 km. Furthermore, the available gridded precipitation datasets are not fine enough for application to hydrologic models. Under conditions of global warming, the application of spatial interpolation methods is crucial for climate change impact studies to obtain spatially continuous data. In recent research projects, although some methods perform better than others, no method yields the best results in all cases. The objective of this paper, therefore, is to investigate different spatial interpolation methods for daily precipitation over a small basin (approximately 400 km2) located in the coastal region of Southern Vietnam, and to find the most efficient interpolation method for this catchment. Five different interpolation methods, consisting of Cressman, ordinary kriging, regression kriging, dual kriging, and inverse distance weighting, were applied to identify the best method for the study area on the spatio-temporal scale (daily, 10 km x 10 km). A 30-year precipitation database was created and merged into the available gridded datasets. Finally, observed changes in the constructed precipitation were analyzed. The results demonstrate that ordinary kriging interpolation is an effective approach for analyzing daily precipitation. Mixed trends of increasing and decreasing monthly, seasonal, and annual precipitation were documented at significant levels.
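
Of the five methods, inverse distance weighting is the simplest to sketch; the station coordinates, rainfall values, and power parameter below are invented, and kriging would replace these weights with ones derived from a fitted variogram.

```python
import numpy as np

def idw(stations, values, targets, power=2.0):
    """Inverse distance weighting from stations onto target grid points."""
    d = np.linalg.norm(targets[:, None, :] - stations[None, :, :], axis=2)
    d = np.maximum(d, 1e-10)          # avoid division by zero at station points
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(5)
stations = rng.uniform(0, 50, size=(30, 2))   # stand-in rain-gauge locations (km)
rain = rng.gamma(2.0, 5.0, size=30)           # stand-in daily precipitation (mm)

gx, gy = np.meshgrid(np.arange(0, 50, 10), np.arange(0, 50, 10))
grid = np.column_stack([gx.ravel(), gy.ravel()])   # 10 km x 10 km target grid
print(idw(stations, rain, grid).round(1))
```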

Keywords: interpolation, precipitation, trend, Vietnam

Procedia PDF Downloads 258
14298 Students’ and Clinical Supervisors’ Experiences of Occupational Therapy Practice Education: A Structured Critical Review

Authors: Hamad Alhamad, Catriona Khamisha, Emma Green, Yvonne Robb

Abstract:

Introduction: Practice education is a key component of occupational therapy education. This critical review aimed to explore students’ and clinical supervisors’ experiences of practice education, and to make recommendations for research. Method: The literature was systematically searched using five databases. Qualitative, quantitative and mixed methods studies were included. Critical Appraisal Skills Programme checklist for qualitative studies and Mixed Methods Assessment Tool for quantitative and mixed methods studies were used to assess study quality. Findings: Twenty-two studies with high quality scores were included: 16 qualitative, 3 quantitative and 3 mixed methods. Studies were conducted in Australia, Canada, USA and UK. During practice education, students learned professional skills, practical skills, clinical skills and problem-solving skills, and improved confidence and creativity. Supervisors had an opportunity to reflect on their practice and get experience of supervising students. However, clear objectives and expectations for students, and sufficient theoretical knowledge, preparation and resources for supervisors were required. Conclusion: Practice education provides different skills and experiences, necessary to become competent professionals; but some areas of practice education need to improve. Studies in non-western countries are needed to explore the perspectives of students and clinical supervisors in different cultures, to ensure the practice education models adopted are relevant.

Keywords: occupational therapy, practice education, fieldwork, students, clinical supervisors

Procedia PDF Downloads 178
14297 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016

Authors: Dimitra Alexiou

Abstract:

During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore their preferences with regard to the month of travel, the selected destinations, as well as the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. From the computation of statistical parameters and the forecasting with exponential smoothing, useful conclusions are drawn that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes for the coming years. The results of this paper and the computed forecast can also be used for decision making by private tourist enterprises that are investing in Greece. With regard to the statistical methods, simple exponential smoothing of time series data is employed. The search for the best forecast for 2017 and 2018 determines the value of the smoothing coefficient. Microsoft Excel is used for all statistical computations and graphics.
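
Simple exponential smoothing reduces to one recurrence, s_t = alpha*x_t + (1 - alpha)*s_{t-1}, and the smoothing coefficient can be chosen by minimizing one-step-ahead errors, as in this sketch (the arrival figures are illustrative, not the paper's data):

```python
def ses(series, alpha):
    """Simple exponential smoothing; the final level is the next forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def sse(series, alpha):
    """Sum of squared one-step-ahead forecast errors for a given alpha."""
    level, err = series[0], 0.0
    for x in series[1:]:
        err += (x - level) ** 2
        level = alpha * x + (1 - alpha) * level
    return err

arrivals = [14.9, 15.5, 16.4, 17.9, 22.0, 23.6, 26.1, 28.1]  # illustrative, millions

best = min((a / 100 for a in range(1, 100)), key=lambda a: sse(arrivals, a))
print(best, round(ses(arrivals, best), 1))   # coefficient and next-year forecast
```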

Keywords: tourism, statistical methods, exponential smoothing, land spatial planning, economy

Procedia PDF Downloads 236
14296 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods

Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard

Abstract:

Non-invasive sampling is an alternative to collecting genetic samples directly. Non-invasive samples are collected without manipulating the animal (e.g., scats, feathers, and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping. For some years, those errors delayed the widespread use of non-invasive genetic information. Genotyping errors can be limited by using analysis methods that can accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms stand out as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical performance comparisons among them. A comparison of methods on datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially for endangered and rare populations. To compare the analysis methods, four different datasets obtained from the Dryad digital repository were used. Three matching algorithms (Cervus, Colony, and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes and two for population estimation (Capwire and BayesN). The three matching algorithms showed different patterns of results. ETLM produced a smaller number of unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed, which is not a surprise given the similarity of their pairwise likelihood and clustering algorithms. The ETLM matches showed almost no similarity with the genotypes matched by the other methods. The different clustering system and error model of ETLM seem to lead to a stricter selection, although the processing time and interface friendliness of ETLM were the worst among the compared methods. The population estimators performed differently across the datasets; there was a consensus between the estimators for only one dataset. BayesN showed both higher and lower estimates when compared with Capwire. BayesN does not consider the total number of recaptures, as Capwire does, but only the recapture events, which makes the estimator sensitive to data heterogeneity, that is, to different capture rates between individuals. In these examples, tolerance for homogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An extended analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems the most appropriate for general use, considering the balance of time, interface, and robustness. The heterogeneity of the recaptures strongly affected the BayesN estimates, leading to over- and underestimation of population numbers. Capwire is therefore advisable for general use, since it performs better in a wide range of situations.

Keywords: algorithms, genetics, matching, population

Procedia PDF Downloads 119
14295 Bacteriological Culture Methods and Their Uses in Clinical Pathology

Authors: Prachi Choudhary, Jai Gopal Sharma

Abstract:

Microbial cultures determine the type of organism, its abundance in the tested sample, or both, and constitute one of the primary diagnostic methods of microbiology. Culture is used to determine the cause of an infectious disease by letting the agent multiply in a predetermined medium. Different bacterial species produce colonies that may look very distinct from one another. To culture any pathogen or microorganism, we should first know the types of media used in microbiology for culturing; sub-culturing is also done when mixed growth is seen in a culture. Based on consistency, there are three types of culture media - solid, semi-solid, and liquid (broth) - which are explained further in the report. The Five I's approach is a method for locating, growing, observing, and characterizing microorganisms, comprising inoculation, incubation, isolation, inspection, and identification. To identify bacteria, a sample such as urine, sputum, or blood is cultured on a suitable medium; there are different methods of culturing the bacteria or microbe, such as the pour plate method, the streak plate method, swabbing by needle, pipetting, inoculation by loop, and spreading by spreader. Bacterial growth is examined after 24 hours of incubation, and an antibiotic susceptibility test is then conducted to determine which antibiotics the bacteria are sensitive or resistant to and to help identify the organism. Antibiotic susceptibility testing is performed by various methods, such as the dilution method, the disk diffusion method, and the E-test. After that, medicines are provided to the patient according to the antibiotic sensitivity and resistance results.

Keywords: inoculation, incubation, isolation, antibiotic susceptibility test, characterization

Procedia PDF Downloads 51
14294 Analysis of Non-Coding Genome in Streptococcus pneumoniae for Molecular Epidemiology Typing

Authors: Martynova Alina, Lyubov Buzoleva

Abstract:

Streptococcus pneumoniae is a causative agent of pneumonia and meningitis throughout the world. Having high genetic diversity, this microorganism can cause different clinical forms of pneumococcal infection and is difficult to diagnose microbiologically by routine methods. Epidemiological surveillance also requires better-developed methods of molecular typing, because the current method of serotyping does not properly distinguish invasive from non-invasive isolates. The non-coding genome of bacteria seems an interesting source in the search for highly discriminating markers to distinguish subspecies of a bacterium as variable as Streptococcus pneumoniae. Technically, we propose a scheme for discriminating S. pneumoniae strains by amplification of a non-coding region (SP_1932) followed by restriction with two types of enzymes, Alu1 and Mn1. Aim: This research aimed to compare different typing methods and their application for molecular epidemiology purposes. Methods: We analyzed a population of 100 S. pneumoniae strains isolated from different patients using different molecular epidemiology methods, namely pulsed-field gel electrophoresis (PFGE), restriction fragment length polymorphism (RFLP) analysis, and multilocus sequence typing (MLST), all compared with the classic typing method of serotyping. Discriminatory power was estimated with Simpson's index (SI). Results: We found that the most discriminating typing method is RFLP (SI = 0.97, with 42 genotypes distinguished). PFGE was slightly less discriminating (SI = 0.95, with 35 genotypes identified). MLST is still the best reference method (SI = 1.0). The classic method of serotyping showed rather weak discriminatory power (SI = 0.93, 24 genotypes). In addition, the sensitivity of RFLP was 100%, and its specificity was 97.09%. Conclusion: The most appropriate method for routine epidemiological surveillance is RFLP on the non-coding region of Streptococcus pneumoniae, then PFGE, though in some cases these results should be confirmed by MLST.
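
The Simpson's index of discrimination behind the SI values is D = 1 - sum n_j(n_j - 1) / (N(N - 1)), taken over typing groups of sizes n_j. The sketch below reproduces an SI near 0.97 for 100 strains in 42 genotypes; the individual group sizes are hypothetical, since the abstract reports only the totals.

```python
def simpson_discrimination(group_sizes):
    """Simpson's index of discrimination over typing-group sizes."""
    N = sum(group_sizes)
    return 1 - sum(n * (n - 1) for n in group_sizes) / (N * (N - 1))

# Hypothetical split of 100 strains into 42 RFLP genotypes.
rflp_groups = [10, 6, 5, 5, 4, 4, 3, 3, 3, 3] + [2] * 22 + [1] * 10
assert sum(rflp_groups) == 100 and len(rflp_groups) == 42
print(round(simpson_discrimination(rflp_groups), 2))   # -> 0.97
```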

Keywords: molecular epidemiology typing, non-coding genome, Streptococcus pneumoniae, MLST

Procedia PDF Downloads 367
14293 Multilabel Classification with Neural Network Ensemble Method

Authors: Sezin Ekşioğlu

Abstract:

Multilabel classification is of huge importance for several applications; it is also a challenging research topic. It is a kind of supervised learning in which each instance has a set of binary targets: the difference between multilabel and binary classification is that an instance may belong to more than one class. There is a wide range of applications for multilabel prediction, such as image labeling, text categorization, and gene functionality. Even so, instances may not always be properly classified across their many classes. There are many ensemble methods for classification; however, most researchers have been concerned with better multilabel methods in general, and few focus on the efficiency of classifiers and on pairwise label relationships at the same time in order to achieve better multilabel classification. In this paper, we work on modified ensemble methods that benefit from k-Nearest Neighbors and a neural network structure to address these issues and obtain better multilabel classification results. Publicly available datasets (yeast, emotion, scene, and birds) are used to demonstrate the efficiency of the developed algorithm, and the technique is measured by accuracy, F1 score, and Hamming loss metrics. Our algorithm improves on benchmarks for each dataset under these metrics.
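
A minimal sketch of the idea - averaging label probabilities from a k-Nearest Neighbors model and a neural network, then scoring with Hamming loss and F1 - might look as follows; the synthetic dataset and the 0.5 threshold are our stand-ins for the paper's actual algorithm and benchmark datasets.

```python
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.metrics import f1_score, hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

X, Y = make_multilabel_classification(n_samples=600, n_classes=6, random_state=0)
Xtr, Xte, Ytr, Yte = train_test_split(X, Y, random_state=0)

knn = KNeighborsClassifier().fit(Xtr, Ytr)
mlp = MLPClassifier(max_iter=1000, random_state=0).fit(Xtr, Ytr)

def label_proba(model, X):
    """Per-label positive-class probabilities, whatever shape the model returns."""
    p = model.predict_proba(X)
    if isinstance(p, list):             # kNN: one (n, 2) array per label
        return np.column_stack([q[:, 1] for q in p])
    return p                            # MLP: already (n, n_labels)

# Simple ensemble: average the two models' probabilities, threshold at 0.5.
pred = ((label_proba(knn, Xte) + label_proba(mlp, Xte)) / 2 >= 0.5).astype(int)
print("Hamming loss:", hamming_loss(Yte, pred))
print("micro F1:", f1_score(Yte, pred, average="micro"))
```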

Keywords: multilabel, classification, neural network, KNN

Procedia PDF Downloads 130
14292 Flicker Detection with Motion Tolerance for Embedded Camera

Authors: Jianrong Wu, Xuan Fu, Akihiro Higashi, Zhiming Tan

Abstract:

CMOS image sensors with a rolling shutter are used broadly in the digital cameras embedded in mobile devices. The rolling shutter suffers from flicker artifacts under fluorescent lamps, and these artifacts are easily observed. In this paper, the characteristics of illumination flicker in the motion case are analyzed, and two efficient detection methods based on matching fragment selection are proposed. According to the experimental results, our methods achieve as high as 100% accuracy in static scenes, and at least 97% in motion scenes.

Keywords: illumination flicker, embedded camera, rolling shutter, detection

Procedia PDF Downloads 396
14291 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE

Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao

Abstract:

For impact monitoring of distributed structures, the traditional positioning methods are based on time differences; they include the four-point arc positioning method and the triangulation positioning method. In actual operation, however, both methods have errors. In this paper, the Multi-Agent Blackboard Coordination Principle is used to combine the two methods. The fusion steps are: (1) the four-point arc locating agent calculates the initial point and records it in the blackboard module; (2) the triangulation agent obtains its initial parameters by accessing the initial point; (3) the triangulation agent repeatedly accesses the blackboard module to update its initial parameters, and it also logs its calculated point onto the blackboard; (4) when the subsequent calculated point and the initial calculated point are within the allowable error, the whole coordination fusion process is finished. This paper presents a multi-agent collaboration method whose agent framework is JADE. The JADE platform consists of several agent containers, with an agent running in each container. Because of JADE's mature management and debugging tools, it is very convenient for dealing with complex data in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on multi-agent coordination fusion can reduce the error of the two individual methods.

Keywords: impact monitoring, structural health monitoring (SHM), multi-agent system (MAS), blackboard coordination, JADE

Procedia PDF Downloads 152
14290 Modification of Underwood's Equation to Calculate Minimum Reflux Ratio for a Column with One Side Stream above the Feed

Authors: S. Mousavian, A. Abedianpour, A. Khanmohammadi, S. Hematian, Gh. Eidi Veisi

Abstract:

Distillation is one of the most important and widely used separation methods in industrial practice. There are different ways to design a distillation column; one of them is the shortcut method, in which material balances and equilibrium relations are employed to calculate the number of trays in the column. Several methods are classified as shortcut methods; one of them is the Fenske-Underwood-Gilliland method, in which the minimum reflux ratio is calculated by the Underwood equation. Underwood proposed an equation that is useful for a simple distillation column with one feed and one top and one bottom product. In this study, the Underwood method is extended to predict the minimum reflux ratio for a column with one side stream above the feed. The results of this model are compared with the McCabe-Thiele method and show that the proposed method is able to calculate the minimum reflux ratio with very small error.
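
For reference, the classic (single-feed, no side stream) Underwood calculation that the paper modifies proceeds in two steps: find the root theta of sum(alpha_i*z_i/(alpha_i - theta)) = 1 - q between the key volatilities, then compute Rmin + 1 = sum(alpha_i*x_Di/(alpha_i - theta)). A sketch with invented compositions follows; the paper's side-stream modification is not reproduced here.

```python
import numpy as np
from scipy.optimize import brentq

alpha = np.array([4.0, 2.0, 1.0])   # relative volatilities (light, LK, HK)
z = np.array([0.3, 0.3, 0.4])       # feed composition
xD = np.array([0.95, 0.05, 0.0])    # distillate composition
q = 1.0                             # saturated liquid feed

# First Underwood equation; the relevant root theta lies between the key
# volatilities (here between 1.0 and 2.0).
f = lambda th: np.sum(alpha * z / (alpha - th)) - (1 - q)
theta = brentq(f, 1.0 + 1e-6, 2.0 - 1e-6)

# Second Underwood equation: Rmin + 1 = sum(alpha * xD / (alpha - theta)).
Rmin = np.sum(alpha * xD / (alpha - theta)) - 1
print(round(theta, 4), round(Rmin, 3))
```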

Keywords: minimum reflux ratio, side stream, distillation, Underwood’s method

Procedia PDF Downloads 377
14289 Determination of Biological Efficiency Values of Some Pesticide Application Methods under Second Crop Maize Conditions

Authors: Ali Bolat, Ali Bayat, Mustafa Gullu

Abstract:

Maize can be cultivated under both main crop and second crop conditions in Turkey. The main pests of maize under second crop conditions are Sesamia nonagrioides Lefebvre (Lepidoptera: Noctuidae) and Ostrinia nubilalis Hübner (Lepidoptera: Crambidae). Aerial spraying to control these two main maize pests could be carried out in Turkey until 2006, when it was banned due to environmental concerns such as drift of the sprayed pesticides and low biological efficiency. In this context, ground sprayers that can treat tall maize plants (>175 cm) have come into use. Several methods of increasing the success of ground spraying were tested in field experiments conducted in second crop maize in 2008 and 2009. For this aim, six spraying methods (air-assisted spraying with TX cone jet nozzles, domestic cone nozzles, twinjet nozzles, air induction nozzles, standard domestic cone nozzles, and tail booms) were used at two application rates (150 and 300 l/ha) with a sprayer. Biological efficacy evaluations of each method were carried out in each plot, including counts of the number of insect-damaged plants, the number of holes in stems, and the live larvae and pupae in the stems of selected plants. As a result, the highest biological efficacy value (close to 70%) was obtained with the air-assisted spraying method at an application volume of 300 l/ha.

Keywords: air assisted sprayer, drift nozzles, biological efficiency, maize plant

Procedia PDF Downloads 186
14288 Comparison between XGBoost, LightGBM and CatBoost Using a Home Credit Dataset

Authors: Essam Al Daoud

Abstract:

Gradient boosting methods have proven to be a very important strategy; many successful machine learning solutions have been developed using XGBoost and its derivatives. The aim of this study is to investigate and compare the efficiency of three gradient boosting methods. The Home Credit dataset, which contains 219 features and 356,251 records, is used in this work. In addition, new features are generated, and several techniques are used to rank and select the best features. The implementation indicates that LightGBM is faster and more accurate than CatBoost and XGBoost across varying numbers of features and records.
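
A comparison harness of the kind described can be sketched as follows; the synthetic data, feature counts, and hyperparameters are stand-ins for the Home Credit setup, so the relative timings and AUCs will not match the paper's.

```python
import time
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

# Synthetic stand-in for the Home Credit data (the real set has 356,251 records).
X, y = make_classification(n_samples=20000, n_features=50, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

models = {"XGBoost": XGBClassifier(n_estimators=200, verbosity=0),
          "LightGBM": LGBMClassifier(n_estimators=200, verbose=-1),
          "CatBoost": CatBoostClassifier(n_estimators=200, verbose=0)}

for name, model in models.items():
    start = time.perf_counter()
    model.fit(Xtr, ytr)
    auc = roc_auc_score(yte, model.predict_proba(Xte)[:, 1])
    print(f"{name}: AUC={auc:.4f}, fit time={time.perf_counter() - start:.1f}s")
```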

Keywords: gradient boosting, XGBoost, LightGBM, CatBoost, home credit

Procedia PDF Downloads 135
14287 Software Development for AASHTO and Ethiopian Roads Authority Flexible Pavement Design Methods

Authors: Amare Setegn Enyew, Bikila Teklu Wodajo

Abstract:

The primary aim of flexible pavement design is to ensure the development of economical and safe road infrastructure. However, failures can still occur due to improper or erroneous structural design. In Ethiopia, the design of flexible pavements relies on manual calculations and on selecting a pavement structure from a catalogue. The catalogue offers, in eight different charts, alternative structures for combinations of traffic and subgrade classes, as outlined in the Ethiopian Roads Authority (ERA) Pavement Design Manual 2001. Furthermore, design modification is allowed in accordance with the structural number principles outlined in the AASHTO 1993 Guide for Design of Pavement Structures. Nevertheless, the manual calculation and design process involves the use of nomographs, charts, tables, and formulas, which increases the likelihood of human error and inaccuracy, and this may lead to unsafe or uneconomical road construction. To address this challenge, a software package called AASHERA has been developed for the AASHTO 1993 and ERA design methods, using the MATLAB language. The software accurately determines the required thicknesses of the flexible pavement surface, base, and subbase layers for the two methods. It also digitizes design inputs and references such as nomographs, charts, default values, and tables. Moreover, the software allows easier comparison of the two design methods in terms of results and construction cost. AASHERA's accuracy has been confirmed through comparisons with designs from handbooks and manuals. The software can help reduce human error, inaccuracy, and time consumption compared with the conventional manual design methods employed in Ethiopia. AASHERA, with its validated accuracy, proves to be an indispensable tool for flexible pavement structure designers.
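
The AASHTO 1993 layer check behind such a tool is the structural number SN = a1*D1 + a2*D2*m2 + a3*D3*m3, compared against the SN required by the design nomograph. A minimal sketch with typical coefficient values from the AASHTO 1993 guide (not AASHERA's actual code):

```python
# Layer coefficients (a_i) and drainage coefficients (m_i) are typical values
# from the AASHTO 1993 guide; thicknesses D_i are in inches.
def structural_number(D, a=(0.44, 0.14, 0.11), m=(1.0, 1.0, 1.0)):
    surface, base, subbase = D
    return a[0] * surface + a[1] * base * m[1] + a[2] * subbase * m[2]

required_SN = 4.0                     # would come from the design nomograph
design = (4.0, 8.0, 12.0)             # trial surface/base/subbase thicknesses
SN = structural_number(design)
print(f"SN = {SN:.2f}", "OK" if SN >= required_SN else "increase thicknesses")
```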

Keywords: flexible pavement design, AASHTO 1993, ERA, MATLAB, AASHERA

Procedia PDF Downloads 40
14286 Virtual Customer Integration in Innovation Development: A Systematic Literature Review

Authors: Chau Nguyen Pham Minh

Abstract:

The aim of this study is to answer the following research question: what do we know about virtual customer integration in innovation development, based on existing empirical research? The paper is based on a systematic review of 136 articles published in the past 16 years. The analysis focuses on three areas: what forms of virtual customer integration (e.g., netnography, online co-creation, virtual experience) have been applied in innovation development; how virtual customer integration methods have been utilized effectively by firms; and what the influences of virtual customer integration on innovation development activities are. Through the detailed analysis, the study provides researchers with a broad understanding of virtual customer integration in innovation development. The study shows that practitioners and researchers increasingly pay attention to virtual customer integration methods in developing innovation, since those methods have clear advantages for interacting with customers in order to generate the best ideas for innovation development. Additionally, the findings indicate that netnography has been the most common method for integrating customers in idea generation, while virtual product experience has been used mainly in product testing. Moreover, the analysis also reveals the positive and negative influences of virtual customer integration on innovation development from both process and strategic perspectives. Most of the reviewed studies examined the phenomenon from the company's perspective, to understand the process of applying virtual customer integration methods and their impacts; the customers' perspective on participating in the virtual interaction, however, has been studied inadequately, which leaves many potentially interesting research paths for future studies.

Keywords: innovation, virtual customer integration, co-creation, netnography, new product development

Procedia PDF Downloads 310
14285 Interior Design: Changing Values

Authors: Kika Ioannou Kazamia

Abstract:

This paper examines the action research cycle of the second phase of a longitudinal study of sustainable interior design practices between two groups of stakeholders, designers and clients. During this phase of the action research, the second step of Lewin's change management model - the change stage - has been utilized to change values, approaches, and attitudes toward sustainable design practices among the participants. Affective domain learning theory is utilized to attach new values. Learning with the use of information technology, collaborative learning, and problem-based learning are the learning methods implemented to achieve the objectives. The learning methods and aims require the design of interventions that involve participants in activities leading to an acknowledgment of the benefits of sustainable practices. The interventions are steered toward measuring participants' judgments of the worth and relevance of ideas and experiences, and their acceptance of or commitment to a particular stance or action. The data collection methods used in this action research are observers' reports, participant questionnaires, and interviews. The data analyses use both quantitative and qualitative methods. The main benefit of the quantitative method was to provide the means to separate the many factors that obscured the main qualitative findings. The qualitative method allowed the data to be categorized following a deductive approach and then examined for commonalities that could reflect relevant categories or themes. The results from the data indicate that, during the second phase, the designer and client participants altered their behaviours.

Keywords: design, change, sustainability, learning, practices

Procedia PDF Downloads 49
14284 Implementation of ADETRAN Language Using Message Passing Interface

Authors: Akiyoshi Wakatani

Abstract:

This paper describes a Message Passing Interface (MPI) implementation of the ADETRAN language and its evaluation on SX-ACE supercomputers. The ADETRAN language includes the pdo statement, which specifies the data distribution and parallel computations, and the pass statement, which specifies the redistribution of arrays. Two methods for implementing the pass statement are discussed, and a performance evaluation using the Splitting-Up CG method is presented. The effectiveness of the parallelization is evaluated, and the advantage of one-dimensional distribution is empirically confirmed by the experimental results.
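
ADETRAN's actual implementation is not given in this abstract, but the kind of redistribution a pass statement implies - here, from row blocks to column blocks - can be sketched with mpi4py's Alltoall; the array sizes and layout below are our assumptions.

```python
# Run with: mpiexec -n 4 python redistribute.py   (n must be divisible by p)
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
p, rank = comm.Get_size(), comm.Get_rank()
n = 8                                   # global array is n x n

# Row-block distribution: each rank owns n/p contiguous full rows.
rows = np.arange(rank * (n // p) * n,
                 (rank + 1) * (n // p) * n, dtype="d").reshape(n // p, n)

# Redistribute to column blocks: cut the local rows into p column chunks,
# exchange chunk j with rank j via Alltoall, then stack the received pieces.
send = np.ascontiguousarray(rows.reshape(n // p, p, n // p).transpose(1, 0, 2))
recv = np.empty_like(send)
comm.Alltoall(send, recv)
cols = recv.reshape(n, n // p)          # each rank now owns n/p full columns

if rank == 0:
    print(cols[:, 0])                   # global column 0: [0. 8. 16. ... 56.]
```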

Keywords: iterative methods, array redistribution, translator, distributed memory

Procedia PDF Downloads 245