Search results for: analysis methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 36528

36108 The Estimation of Human Vital Signs Complexity

Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius

Abstract:

Non-stationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches, which are based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with special analysis methods is suitable for observing physiological processes. We demonstrate the possibility of a deep-physiological-model-based interpretation of changes in the human body’s functional states, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. It is shown that evaluating the interactions of cardiac signals reveals functional changes peculiar to each individual at the onset of the hemodynamic restoration procedure. We therefore suggest that assessment of alterations in the body’s functional state after patients undergo surgery can be complemented by data obtained from the suggested approach of evaluating the interactions of functional variables.

Keywords: cardiac diseases, complex systems theory, ECG analysis, matrix analysis

Procedia PDF Downloads 325
36107 Protein Remote Homology Detection by Using Profile-Based Matrix Transformation Approaches

Authors: Bin Liu

Abstract:

As one of the most important tasks in protein sequence analysis, protein remote homology detection has been studied for decades. Currently, profile-based methods show state-of-the-art performance. The Position-Specific Frequency Matrix (PSFM) is a widely used profile; however, profiles contain noise introduced by amino acids with low frequencies. In this study, we propose the Top Frequency Profile (TFP), which removes this noise from the PSFM by discarding low-frequency amino acids. Three matrix transformation methods, Autocross Covariance (ACC), Tri-gram, and K-Separated Bigram (KSB), are applied to these profiles to convert them into fixed-length feature vectors. Combined with Support Vector Machines (SVMs), the predictors are constructed. Experimental results on two benchmark datasets show that the proposed methods outperform other state-of-the-art predictors.
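The abstract fixes neither the TFP cut-off nor the transformation parameters; a minimal sketch of the two steps, with a hypothetical `top_k` cut-off and a reduced amino-acid alphabet, might look like:

```python
import numpy as np

def top_frequency_profile(psfm, top_k=4):
    """Keep only the top_k most frequent amino acids at each position,
    zero the rest, and renormalise (top_k is a hypothetical parameter)."""
    tfp = np.zeros_like(psfm, dtype=float)
    for i, row in enumerate(psfm):
        keep = np.argsort(row)[-top_k:]        # indices of largest frequencies
        tfp[i, keep] = row[keep]
        total = tfp[i].sum()
        if total > 0:
            tfp[i] /= total
    return tfp

def acc_transform(profile, max_lag=2):
    """Autocross covariance: turn a variable-length L x A profile into a
    fixed-length vector of lagged covariances between columns."""
    L, A = profile.shape
    mean = profile.mean(axis=0)
    feats = []
    for lag in range(1, max_lag + 1):
        for a in range(A):
            for b in range(A):
                feats.append(np.mean((profile[:L - lag, a] - mean[a]) *
                                     (profile[lag:, b] - mean[b])))
    return np.array(feats)
```

An SVM is then trained on the fixed-length vectors; the ACC output has length `max_lag * A * A` regardless of sequence length, which is what makes profiles of different lengths comparable.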

Keywords: protein remote homology detection, protein fold recognition, top frequency profile, support vector machines

Procedia PDF Downloads 106
36106 Family Planning Use among Women Living with HIV in Malawi: Analysis from Malawi DHS-2010 Data

Authors: Dereje Habte, Jane Namasasu

Abstract:

Background: The aim of the analysis was to assess the practice of family planning (FP) among HIV-infected women and the influence of women’s awareness of their HIV-positive status on that practice. Methods: The analysis covered 489 non-pregnant, sexually active, fecund women living with HIV. Results: Of the 489 confirmed HIV-positive women, 184 (37.6%) reported that they knew they were HIV positive. The numbers of women with current use of, and unmet need for, any family planning method were 251 (51.2%) and 107 (21.9%), respectively. Women’s knowledge of their HIV-positive status (AOR: 2.32 (1.54, 3.50)), secondary or higher education (AOR: 2.36 (1.16, 4.78)), and having 3-4 (AOR: 2.60 (1.08, 6.28)) or more than four living children (AOR: 3.03 (1.18, 7.82)) were significantly associated with current use of family planning. Conclusion: Women’s awareness of their HIV-positive status was found to significantly predict family planning practice among women living with HIV.
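The adjusted odds ratios (AORs) above come from multivariable logistic regression on the DHS data; the crude (unadjusted) analogue can be computed directly from a 2×2 table. A sketch with made-up counts (not the paper's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and 95% CI (Woolf method) from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

For example, a table with a=10, b=20, c=5, d=40 gives a crude OR of 4.0; an adjusted OR additionally controls for covariates such as education and parity via logistic regression.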

Keywords: family planning, HIV, Malawi, women

Procedia PDF Downloads 583
36105 Statistical and Analytical Comparison of GIS Overlay Modelings: An Appraisal on Groundwater Prospecting in Precambrian Metamorphics

Authors: Tapas Acharya, Monalisa Mitra

Abstract:

Overlay modeling is the most widely used conventional analysis for spatial decision support systems. It requires a set of thematic layers, weighted in various ways, whose combination gives a resultant input for further integrated analysis. Despite its popularity, it yields inconsistent and erroneous results for similar inputs when processed with different GIS overlay techniques. This study compares and analyses the differences in the outputs of different GIS overlay methods applied to the same set of thematic layers for groundwater prospecting in Precambrian metamorphic rocks. The objective is to identify the most suitable overlay method for groundwater prospecting in older Precambrian metamorphics. Seven input thematic layers, slope, Digital Elevation Model (DEM), soil thickness, lineament intersection density, average groundwater table fluctuation, stream density, and lithology, were used in the fuzzy overlay, weighted overlay, and weighted sum overlay methods to delineate suitable groundwater prospective zones. Spatial concurrence analysis with high-yielding wells of the study area and statistical comparison of the outputs of the overlay models using RStudio reveal that the weighted overlay model is the most efficient GIS overlay model for delineating groundwater prospecting zones in the Precambrian metamorphic rocks.
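The difference between the weighted overlay and weighted sum combiners the study compares can be sketched with numpy arrays standing in for reclassified rasters (the weights and 1-9 score scale here are illustrative, not the study's):

```python
import numpy as np

def weighted_overlay(layers, weights, scale=(1, 9)):
    """Weighted overlay: reclassified layers (integer suitability scores on
    a common scale) are combined by percent influence, and the result is
    rounded back onto the score scale -- a simplification of the GIS tool."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                              # normalise to percent influence
    stack = np.stack(layers).astype(float)
    out = np.tensordot(w, stack, axes=1)
    return np.clip(np.rint(out), *scale)

def weighted_sum(layers, weights):
    """Weighted sum: raw (unnormalised) weights, no rescaling or rounding."""
    stack = np.stack(layers).astype(float)
    return np.tensordot(np.asarray(weights, float), stack, axes=1)
```

The rounding and weight normalisation are one source of the divergent outputs the abstract describes: the same inputs produce different value ranges under the two tools.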

Keywords: fuzzy overlay, GIS overlay model, groundwater prospecting, Precambrian metamorphics, weighted overlay, weighted sum overlay

Procedia PDF Downloads 109
36104 Application of Machine Learning Techniques in Forest Cover-Type Prediction

Authors: Saba Ebrahimi, Hedieh Ashrafi

Abstract:

Predicting the cover type of forests is a challenge for natural resource managers. In this project, we perform a comprehensive comparative study of two well-known classification methods: the support vector machine (SVM) and the decision tree (DT). The comparison is first performed among different variants of each classifier, and the best variant of each is then compared using different evaluation metrics. The effect of boosting and bagging for decision trees is also explored, as is the effect of principal component analysis (PCA) and feature selection. Throughout the project, the forest cover-type dataset from the remote sensing and GIS program is used in all computations.
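A minimal version of such a comparison, on synthetic data standing in for the actual cover-type dataset (which has 54 cartographic features and 7 classes; the model settings here are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the forest cover-type data.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "SVM (RBF) + PCA": make_pipeline(StandardScaler(), PCA(n_components=10),
                                     SVC(kernel="rbf")),
    "Decision tree": DecisionTreeClassifier(max_depth=8, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {accuracy_score(y_te, model.predict(X_te)):.3f}")
```

Boosting and bagging would slot in the same way (e.g. wrapping the tree in an ensemble), with each variant evaluated on the held-out split.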

Keywords: classification methods, support vector machine, decision tree, forest cover-type dataset

Procedia PDF Downloads 192
36103 Quantifying Product Impacts on Biodiversity: The Product Biodiversity Footprint

Authors: Leveque Benjamin, Rabaud Suzanne, Anest Hugo, Catalan Caroline, Neveux Guillaume

Abstract:

Human consumption of products is one of the main drivers of biodiversity loss. However, few pertinent ecological indicators of a product's life cycle impact on species and ecosystems have been built. Life cycle assessment (LCA) methodologies are well under way toward standardized methods to assess this impact, already taking partial account of three of the Millennium Ecosystem Assessment pressures (land use, pollution, climate change). Coupling LCA with ecological data and methods is an emerging challenge in developing a product biodiversity footprint. This approach was tested on three case studies from the food processing, textile, and cosmetics industries. It allowed, first, an improvement in the environmental relevance of the Potentially Disappeared Fraction of species, an end-point indicator typically used in life cycle analysis methods, and, second, the introduction of new indicators on overexploitation and invasive species. This type of footprint is a major step in helping companies identify their impacts on biodiversity and propose potential improvements.

Keywords: biodiversity, companies, footprint, life cycle assessment, products

Procedia PDF Downloads 308
36102 A Levelized Cost Analysis for Solar Energy Powered Sea Water Desalination in the Arabian Gulf Region

Authors: Abdullah Kaya, Muammer Koc

Abstract:

A levelized cost analysis of solar-powered seawater desalination in the Emirate of Abu Dhabi is conducted to show that clean and renewable desalination is economically viable. The Emirate relies heavily on seawater desalination for its freshwater needs due to its limited freshwater resources, and this reliance is expected to increase further with the growing population and economic activity, the rapid decline of limited freshwater reserves, and the aggravating effects of climate change. Seawater desalination in Abu Dhabi is currently done through thermal desalination technologies such as multi-stage flash (MSF) and multi-effect distillation (MED), coupled with thermal power plants in an arrangement known as co-generation. Our analysis indicates that these thermal desalination methods are energy-inefficient and harmful to the environment due to CO₂ emissions and other dangerous byproducts. Utilizing clean and renewable desalination options has therefore become a must for the Emirate's transition to a sustainable future. The rapid decline in the cost of solar PV systems for energy production and of RO technology for desalination makes their combination an ideal option for sustainable desalination in the Emirate of Abu Dhabi. A levelized cost analysis for water produced by a solar PV + RO system indicates that Abu Dhabi is well positioned to use this technological combination for cheap and clean desalination in the coming years. It is shown that the capital cost of a solar PV powered RO system can go as low as 101 million US$ (1111 $/m³) in the best case, considering recent technological developments. The levelized cost of water (LCW) values range from 0.34 $/m³ in the baseline case to 0.27 $/m³ in the best case; even the highly conservative case yields an LCW lower than those of all thermal desalination methods currently employed in the Emirate. Exponential cost decreases in both the solar PV and RO sectors, along with increasing economies of scale globally, signal that cheap and clean desalination can be achieved by combining these technologies.
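The levelized cost figures above follow the standard annualized-cost formula; a sketch with illustrative inputs (the discount rate, lifetime, and cost figures below are assumptions, not the paper's):

```python
def capital_recovery_factor(rate, years):
    """CRF = r(1+r)^n / ((1+r)^n - 1): annualizes an upfront capital cost."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def levelized_cost_of_water(capex, annual_opex, annual_m3, rate=0.06, years=25):
    """LCW in $/m3 = (annualized capex + annual opex) / annual production."""
    return (capex * capital_recovery_factor(rate, years) + annual_opex) / annual_m3
```

For instance, `capital_recovery_factor(0.06, 25)` is about 0.078, so roughly 7.8% of the plant's capex is charged against each year's water output before operating costs are added.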

Keywords: solar PV, RO desalination, sustainable desalination, levelized cost analysis, Emirate of Abu Dhabi

Procedia PDF Downloads 147
36101 Comparative Analysis of Enzyme Activities Concerned in Decomposition of Toluene

Authors: Ayuko Itsuki, Sachiyo Aburatani

Abstract:

In recent years, pollution of the environment by toxic substances has become a serious problem. While there are many methods of environmental clean-up, methods based on microorganisms are considered reasonable and safe for the environment. Compost is known to catabolize malodorous substances during its production process; however, the mechanism of this catabolizing system is not yet known. In the catabolization process, organic matter is turned into inorganic matter by enzymes released from the many microorganisms living in the compost. In other words, the cooperation of activated enzymes in the compost decomposes malodorous substances. Thus, clarifying the interactions among enzymes is important for revealing the system by which compost catabolizes malodorous substances. In this study, we used a statistical method to infer the interactions among enzymes. We developed a method that combines partial correlation with cross-correlation to estimate the relevance between enzymes, especially from time-series data with few variables. Because cross-correlation is used, we can estimate not only the associative structure but also the reaction pathway. We applied the developed method to measured enzyme data and estimated the interactions among the enzymes in the decomposition mechanism of toluene.
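The cross-correlation half of the method can be sketched as follows: the lag at which the correlation between two enzyme-activity series peaks hints at the direction of the reaction pathway (this is a simplification of the combined partial/cross-correlation approach):

```python
import numpy as np

def lagged_cross_correlation(x, y, max_lag=5):
    """Pearson correlation of x(t) with y(t + lag) for lags 0..max_lag.
    A peak at a positive lag suggests changes in x precede changes in y."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    return {lag: np.corrcoef(x[:len(x) - lag], y[lag:])[0, 1]
            for lag in range(max_lag + 1)}
```

Partial correlation would then be used on top of this to strip out correlations mediated by a third enzyme.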

Keywords: enzyme activities, comparative analysis, compost, toluene

Procedia PDF Downloads 253
36100 Renovation of Pipeline in Residential Buildings by Polymeric Composites

Authors: Parastou Kharazmi

Abstract:

In this paper, rehabilitation methods for pipelines using advanced polymeric coatings, such as relining, are reviewed. A number of diverse methods in use globally are described, and a brief summary of advances in technology, methods, and materials is provided. The paper explains why sewerage rehabilitation by relining in residential buildings is claimed to be environmentally friendly and economical, discusses the importance of the quality control procedure, and proposes several quality tests.

Keywords: buildings, composite, material, renovation

Procedia PDF Downloads 252
36099 Assessment of Soil Quality Indicators in Rice Soils Under Rainfed Ecosystem

Authors: R. Kaleeswari

Abstract:

An investigation was carried out to assess soil biological quality parameters in rice soils under rainfed conditions, to compare soil quality indexing methods, viz., principal component analysis (PCA), minimum data set (MDS), and indicator scoring, and to develop soil quality indices for formulating soil and crop management strategies. Soil samples were collected and analyzed for soil biological properties following standard procedures. The biological indicators determined for soil quality assessment were microbial biomass carbon and nitrogen (MBC and MBN), potentially mineralizable nitrogen (PMN), soil respiration, and dehydrogenase activity (DHA). Among the methods of rice cultivation, organic nutrition, integrated nutrient management (INM), and the System of Rice Intensification (SRI) registered higher values of MBC, MBN, and PMN, while mechanical and conventional rice cultivation registered lower values of the biological quality indicators. Organic nutrient management and INM enhanced the soil respiration rate; SRI and aerobic rice cultivation likewise increased it, while conventional and mechanical rice farming lowered it. DHA was higher in soils under organic nutrition and INM; SRI and aerobic rice cultivation enhanced DHA, while conventional and mechanical rice cultivation reduced it. The MBC of the rice soils varied from 65 to 244 mg kg-1; among the nutrient management practices, INM registered the highest microbial biomass carbon of 285 mg kg-1. The PMN content of the rice soils varied from 20.3 to 56.8 mg kg-1; aerobic rice farming registered the highest PMN of 78.9 mg kg-1. The soil respiration rate varied from 60 to 125 µg CO2 g-1; INM registered the highest soil respiration rate of 129 µg CO2 g-1. DHA varied from 38.3 to 135.3 µg TPF g-1 day-1; the SRI method of rice cultivation registered the highest DHA of 160.2 µg TPF g-1 day-1. PCA was used to select representative soil quality indicators, and soil variables from each principal component were considered for the minimum data set (MDS). In intensive rice-cultivating regions, indicators were selected based on factor loading and contribution percentage using PCA, and variables showing significant differences among production systems were used to prepare the MDS.
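The PCA-based indicator selection can be sketched as follows: retain principal components up to a cumulative-variance threshold and keep, for each, the variable with the highest absolute loading (the 80% threshold and one-variable-per-PC rule are common simplifications, not necessarily this study's exact criteria):

```python
import numpy as np

def select_mds(X, names, var_threshold=0.8):
    """Pick one indicator per retained principal component: the variable
    with the highest absolute loading on that component."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize indicators
    vals, vecs = np.linalg.eigh(np.cov(Xs, rowvar=False))
    order = np.argsort(vals)[::-1]                  # sort PCs by variance
    vals, vecs = vals[order], vecs[:, order]
    explained = np.cumsum(vals) / vals.sum()
    n_pc = int(np.searchsorted(explained, var_threshold) + 1)
    return [names[int(np.argmax(np.abs(vecs[:, k])))] for k in range(n_pc)]
```

With highly correlated indicators (e.g. MBC and MBN), only one of the pair survives into the minimum data set, which is the point of the MDS reduction.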

Keywords: soil quality, rice, biological properties, PCA analysis

Procedia PDF Downloads 83
36098 Application of Two Stages Adaptive Neuro-Fuzzy Inference System to Improve Dissolved Gas Analysis Interpretation Techniques

Authors: Kharisma Utomo Mulyodinoto, Suwarno, A. Abu-Siada

Abstract:

Dissolved gas analysis (DGA) is an impressive technique for detecting and predicting internal faults in transformers from the gases dissolved in a transformer oil sample. A number of methods are used to interpret the dissolved gases: the Doernenberg ratio method, the IEC (International Electrotechnical Commission) ratio method, and the Duval triangle method. While the assessment of dissolved gases within transformer oil samples has been standardized over the past two decades, analysis of the results is not always straightforward, as it depends on personnel expertise more than on mathematical formulas. To overcome this limitation, this paper aims to improve the interpretation of the Doernenberg ratio, IEC ratio, and Duval triangle methods using a two-stage Adaptive Neuro-Fuzzy Inference System (ANFIS). Dissolved gas analysis data from 520 faulty transformers were analyzed to establish the proposed ANFIS model. Results show that the developed ANFIS model is accurate and can standardize the dissolved gas interpretation process with an accuracy higher than 90%.
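The IEC ratio method that the paper feeds into ANFIS works on three gas ratios; a sketch with a simplified, illustrative subset of the fault bands (consult IEC 60599 for the authoritative limits) might be:

```python
def gas_ratios(h2, ch4, c2h6, c2h4, c2h2):
    """The three IEC ratios: C2H2/C2H4, CH4/H2, C2H4/C2H6."""
    return c2h2 / c2h4, ch4 / h2, c2h4 / c2h6

def iec_ratio_diagnosis(r1, r2, r3):
    """Illustrative, simplified ratio bands -- not the full standard."""
    if r1 > 1 and 0.1 <= r2 <= 0.5 and r3 > 1:
        return "D1: low-energy discharge"
    if 0.6 <= r1 <= 2.5 and 0.1 <= r2 <= 1 and r3 > 2:
        return "D2: high-energy discharge"
    if r1 < 0.1 and r2 > 1 and r3 < 1:
        return "T1: thermal fault"
    return "unresolved (expert review / ANFIS input)"
```

The "unresolved" branch is exactly where crisp ratio methods fail and where a fuzzy system such as ANFIS, with smooth membership functions instead of hard band edges, can improve consistency.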

Keywords: ANFIS, dissolved gas analysis, Doernenberg ratio method, Duval triangular method, IEC ratio method, transformer

Procedia PDF Downloads 133
36097 Tapping into Debt: The Effect of Contactless Payment Methods on Overdraft Fee Occurrence

Authors: Merle Van Den Akker, Neil Stewart, Andrea Isoni

Abstract:

Contactless methods of payment, often referred to as tap-and-go, have become increasingly popular globally. However, little is known about the consequences of this payment method for spending, spending habits, personal finance management, and debt accumulation. The literature on other payment methods, such as credit cards, suggests that through increased ease and reduced friction, the pain of paying is reduced, leading to higher and more frequent spending and hence higher debt accumulation. In this research, we use a dataset of 300 million transactions from 165,000 individuals to see whether the onset of contactless payment use increases the occurrence of overdraft fees. Using the R package MatchIt to match people on initial overdraft occurrence and salary, we find that people who start using contactless incur a significantly higher number of overdraft fees than those who do not start using contactless in the same year. Having accounted for income, opting-in, and time-of-year effects, these results show that contactless payment methods fall within the scope of earlier theories on credit cards, such as the pain of paying, meaning that this payment method leads to increasing difficulty in managing personal finances.
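The matching in the study was done with R's MatchIt; a hedged Python analogue of 1:1 nearest-neighbour matching without replacement on the two matching covariates (salary and baseline overdraft occurrence) could look like:

```python
import numpy as np

def nearest_neighbour_match(treated, control):
    """1:1 nearest-neighbour matching without replacement on standardized
    covariates (rows = individuals; columns = e.g. salary, baseline
    overdrafts) -- a rough analogue of MatchIt's method='nearest'."""
    t = np.asarray(treated, float)
    c = np.asarray(control, float)
    allrows = np.vstack([t, c])
    mu, sd = allrows.mean(axis=0), allrows.std(axis=0)
    sd = np.where(sd == 0, 1.0, sd)                 # guard constant columns
    t, c = (t - mu) / sd, (c - mu) / sd
    available = list(range(len(c)))
    pairs = []
    for i, row in enumerate(t):
        dists = [np.linalg.norm(row - c[j]) for j in available]
        pairs.append((i, available.pop(int(np.argmin(dists)))))
    return pairs
```

After matching, the outcome (number of overdraft fees) is compared between each treated individual and their matched control, so that salary and baseline overdraft differences cannot drive the result.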

Keywords: contactless, debt accumulation, overdraft fees, payment methods, spending

Procedia PDF Downloads 103
36096 Understanding Consumer Behaviors by Using Neuromarketing Tools and Methods

Authors: Tabrej Khan

Abstract:

Neuromarketing can refer to the commercial application of neuroscience technologies and insights to drive business; consumer neuroscience, on the other hand, can be seen as the academic use of neuroscience to better understand the effects of marketing on consumer behavior. Together they form a multidisciplinary effort among economics, psychology, neuroscience, and information technology. Traditional methods use surveys, interviews, and focus groups, in which people overtly and consciously report on their experiences and thoughts; the unconscious side of customer behavior is largely unmeasured by these methods. Neuroscience has the potential to capture this unconscious part. In this paper, we present specific results from selected tools and methods used to understand consumer behavior.

Keywords: neuromarketing, neuroscience, consumer behaviors, tools

Procedia PDF Downloads 373
36095 A Comprehensive Analysis of the Phylogenetic Signal in Ramp Sequences in 211 Vertebrates

Authors: Lauren M. McKinnon, Justin B. Miller, Michael F. Whiting, John S. K. Kauwe, Perry G. Ridge

Abstract:

Background: Ramp sequences increase translational speed and accuracy when rare, slowly translated codons are found at the beginnings of genes. Here, the results of the first analysis of ramp sequences in a phylogenetic construct are presented. Methods: Ramp sequences were compared across 211 vertebrates (110 mammalian and 101 non-mammalian). The presence and absence of ramp sequences were analyzed as a binary character in parsimony and maximum likelihood frameworks. Additionally, ramp sequences were mapped to the Open Tree of Life taxonomy to determine the number of parallelisms and reversals that occurred, and these results were compared to what would be expected by random chance. Lastly, aligned nucleotides in ramp sequences were compared to the rest of the sequence to examine possible differences in phylogenetic signal between these regions of the gene. Results: Parsimony and maximum likelihood analyses of the presence/absence of ramp sequences recovered phylogenies that are highly congruent with established phylogenies. Additionally, the retention index of ramp sequences is significantly higher than would be expected by random chance (p-value ≈ 0). A chi-square analysis of completely orthologous ramp sequences also resulted in a p-value of approximately zero compared to random chance. Discussion: Ramp sequences recover phylogenies comparable to those of other phylogenomic methods. Although not all ramp sequences appear to carry phylogenetic signal, more ramp sequences track speciation than expected by random chance. Therefore, ramp sequences may be used in conjunction with other phylogenomic approaches.
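The retention index cited in the results has a standard parsimony definition (Farris): RI = (g - s) / (g - m), where g is the maximum possible number of state changes of the character on the tree, s the observed number, and m the minimum. A one-function sketch:

```python
def retention_index(min_steps, obs_steps, max_steps):
    """RI = (g - s) / (g - m): 1 means no homoplasy on the tree,
    0 means the character fits no better than the worst case."""
    return (max_steps - obs_steps) / (max_steps - min_steps)
```

A high RI for the ramp presence/absence character is what supports the claim that ramp sequences track speciation rather than arising repeatedly by chance.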

Keywords: codon usage bias, phylogenetics, phylogenomics, ramp sequence

Procedia PDF Downloads 141
36094 Effectiveness of Traditional Chinese Medicine in the Treatment of Eczema: A Systematic Review and Meta-Analysis Based on Eczema Area and Severity Index Score

Authors: Oliver Chunho Ma, Tszying Chang

Abstract:

Background: Traditional Chinese Medicine (TCM) has been widely used in the treatment of eczema. However, there is currently a lack of comprehensive research on the overall effectiveness of TCM in treating eczema, particularly using the Eczema Area and Severity Index (EASI) score as an evaluation tool. Meta-analysis can integrate the results of multiple studies to provide more convincing evidence. Objective: To conduct a systematic review and meta-analysis based on the EASI score to evaluate the overall effectiveness of TCM in the treatment of eczema. Specifically, the study will review and analyze published clinical studies that investigate TCM treatments for eczema and use the EASI score as an outcome measure, comparing TCM with other treatment modalities, such as conventional Western medicine, in improving the severity of eczema. Methods: Relevant studies, including randomized controlled trials (RCTs) and non-randomized controlled trials, that involve TCM treatment for eczema and use the EASI score as an outcome measure will be searched in medical literature databases such as PubMed and CNKI. Relevant data will be extracted from the selected studies, including study design, sample size, treatment methods, and improvement in EASI score. The methodological quality and risk of bias of the included studies will be assessed using appropriate tools (such as the Cochrane Handbook). The results will be statistically analyzed, including pooling of effect sizes (such as standardized mean differences and relative risks), subgroup analysis (e.g., different TCM syndromes, different treatment modalities), and sensitivity analysis (e.g., excluding low-quality studies). Based on the statistical analysis and quality assessment, the overall effectiveness of TCM in improving the severity of eczema will be interpreted. Expected outcomes: By integrating the results of multiple studies, we expect to provide more convincing evidence regarding the specific effects of TCM in improving the severity of eczema. Subgroup and sensitivity analyses can further elucidate whether the effectiveness of TCM treatment is influenced by different factors. In addition, we will compare the results of the meta-analysis with the clinical data from our clinic: for both, we will compute descriptive statistics such as means, standard deviations, and percentages, and compare the two using statistical tests such as the independent-samples t-test or non-parametric tests.
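The pooling step of such a meta-analysis can be sketched as inverse-variance weighting of per-study effect sizes (fixed-effect model shown here; a random-effects model would additionally estimate between-study variance):

```python
import math

def pool_fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooling of effect sizes (e.g.
    standardized mean differences in EASI improvement); returns the
    pooled estimate and its 95% confidence interval."""
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se
```

Sensitivity analysis then amounts to re-running the pooling after dropping studies (e.g. those rated high risk of bias) and checking whether the pooled estimate moves.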

Keywords: Eczema, traditional Chinese medicine, EASI, systematic review, meta-analysis

Procedia PDF Downloads 35
36093 Testing Causal Model of Depression Based on the Components of Subscales Lifestyle with Mediation of Social Health

Authors: Abdolamir Gatezadeh, Jamal Daghaleh

Abstract:

The lifestyle of individuals is an important determinant of psychological and social health. Recently, especially in developed countries, the relationship between lifestyle and mental illnesses, including depression, has attracted wide attention. This study tests a causal model of depression based on lifestyle, with social health as a mediator. Methods: This study is basic research within the framework of correlational designs. The population includes all adults in Ahwaz city, from which a multistage random sample of 384 subjects was selected. The data were collected and analyzed using structural equation modeling. Results: In the data analysis, path analysis confirmed the fit of the assumed research model. This means that the lifestyle subscales have a direct effect on depression and, through the mediation of social health, an indirect effect on depression as well. Discussion and conclusion: According to the results, the components of lifestyle and social health can be used to explain depression.
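The core of the mediation claim in such a path model is the product of path coefficients; a sketch on synthetic data (the variable names and effect sizes below are invented for illustration, not the study's estimates):

```python
import numpy as np

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation: a = slope of m ~ x,
    b = slope of y ~ m controlling for x; indirect effect = a * b."""
    x, m, y = (np.asarray(v, float) for v in (x, m, y))
    a = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), m,
                        rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones_like(x), x, m]), y,
                        rcond=None)[0][2]
    return a * b

# Synthetic example: lifestyle -> social health -> depression
rng = np.random.default_rng(1)
lifestyle = rng.normal(size=500)
social_health = 2.0 * lifestyle + 0.1 * rng.normal(size=500)
depression = -3.0 * social_health + 0.5 * lifestyle + 0.1 * rng.normal(size=500)
```

Here the true indirect effect is 2.0 × (−3.0) = −6: a better lifestyle raises social health, which in turn lowers depression, which is the direction of mediation the model proposes.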

Keywords: depression, subscales lifestyle, social health, causal model

Procedia PDF Downloads 147
36092 Mathematical Modeling of the AMCs Cross-Contamination Removal in the FOUPs: Finite Element Formulation and Application in FOUP’s Decontamination

Authors: N. Santatriniaina, J. Deseure, T. Q. Nguyen, H. Fontaine, C. Beitia, L. Rakotomanana

Abstract:

Nowadays, with increasing wafer sizes and decreasing critical dimensions in modern high-tech integrated circuit manufacturing, the microelectronics industry must pay maximum attention to contamination control. The move to 300 mm is accompanied by the use of Front Opening Unified Pods (FOUPs) for wafer transport and storage. In these pods, airborne cross-contamination may occur between the wafers and the pod. A predictive approach using modeling and computational methods is a very powerful way to understand and qualify AMC cross-contamination processes. This work investigates the numerical tools required to study AMC cross-contamination transfer phenomena between wafers and FOUPs. Numerical optimization and a finite element formulation in transient analysis were established. An analytical solution of the one-dimensional problem was developed, and the calibration of physical constants was performed by minimizing the least-squares distance between the model (the analytical 1D solution) and the experimental data. The behavior of the AMCs in transient analysis was determined. The model framework preserves the classical forms of the diffusion and convection-diffusion equations and yields a consistent form of Fick's law. The adsorption process and the surface roughness effect were also expressed as boundary conditions, using a switch from Dirichlet to Neumann conditions and an interface condition. The methodology is applied first using optimization methods with the analytical solution to determine the physical constants, and second using the finite element method, including adsorption kinetics and the Dirichlet-to-Neumann switch.
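A hedged 1D sketch of the transient diffusion problem with the Dirichlet (fixed contaminant source) and Neumann (zero-flux) boundaries the paper discusses, using an explicit finite-difference scheme rather than the paper's finite element formulation:

```python
import numpy as np

def diffuse_1d(c0, D, dx, dt, steps, c_left=None):
    """Explicit finite-difference solution of dc/dt = D * d2c/dx2 (Fick's
    second law). c_left, if given, fixes a Dirichlet concentration at the
    left boundary (e.g. an AMC source at the FOUP wall); the right boundary
    is zero-flux (Neumann). Stable only when r = D*dt/dx**2 <= 0.5."""
    c = np.asarray(c0, float).copy()
    r = D * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    for _ in range(steps):
        if c_left is not None:
            c[0] = c_left                          # Dirichlet: fixed source
        c[1:-1] += r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[-1] = c[-2]                              # Neumann: zero flux
    return c
```

Switching `c_left` from a fixed value to `None` mid-run mimics the Dirichlet-to-Neumann switch used to model the source being exhausted or the adsorption regime changing.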

Keywords: AMCs, FOUP, cross-contamination, adsorption, diffusion, numerical analysis, wafers, Dirichlet to Neumann, finite elements methods, Fick’s law, optimization

Procedia PDF Downloads 484
36091 TDApplied: An R Package for Machine Learning and Inference with Persistence Diagrams

Authors: Shael Brown, Reza Farivar

Abstract:

Persistence diagrams capture valuable topological features of datasets that other methods cannot uncover. Still, their adoption in data pipelines has been limited by the lack of publicly available tools in R (and Python) for analyzing groups of diagrams with machine learning and statistical inference. In an easy-to-use and scalable R package called TDApplied, we implement several applied analysis methods tailored to groups of persistence diagrams. The two main contributions of our package are comprehensiveness (most functions have no implementations elsewhere) and speed (shown through benchmarking against other R packages). We demonstrate the tools on simulated data to illustrate how easily practical analyses of any dataset can be enhanced with topological information.

Keywords: machine learning, persistence diagrams, R, statistical inference

Procedia PDF Downloads 62
36090 From Text to Data: Sentiment Analysis of Presidential Election Political Forums

Authors: Sergio V Davalos, Alison L. Watkins

Abstract:

User-generated content (UGC), such as website posts, has data associated with it: the time of the post, gender, location, type of device, and number of words. The text entered in UGC can provide a valuable dimension for analysis. In this research, each user post is treated as a collection of terms (words); in addition to the number of words per post, the frequency of each term is determined per post and as the sum of occurrences across all posts. This research focuses on one specific aspect of UGC: sentiment. Sentiment analysis (SA) was applied to the content (user posts) of two sets of political forums related to the US presidential elections of 2012 and 2016. Sentiment analysis derives data from the text, which enables the subsequent application of data analytic methods. The SASA (SAIL/SAI Sentiment Analyzer) model was used for sentiment analysis, producing a sentiment score for each post. Based on these scores, there are significant differences between the content and sentiment of the 2012 and 2016 presidential election forums. In the 2012 forums, 38% of the forums started with positive sentiment and 16% with negative sentiment; in the 2016 forums, 29% started with positive sentiment and 15% with negative sentiment. There were also changes in sentiment over time: for both elections, as the election drew closer, the cumulative sentiment score became negative. The candidate who won each election appeared in more posts than the losing candidates; in the case of Trump, the number of negative posts exceeded Clinton's highest number of posts, which were positive. KNIME topic modeling was used to derive topics from the posts, and there were also changes in topics and keyword emphasis over time: initially the political parties were the most referenced, and as the election drew closer the emphasis shifted to the candidates. The SASA method proved to predict sentiment better than four other methods in SentiBench. The research derived sentiment data from text; in combination with other data, the sentiment data provided insight and discovery about user sentiment in the 2012 and 2016 US presidential elections.
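SASA itself is a trained statistical model; as an illustration of how any sentiment scorer turns post text into analyzable data, a toy lexicon-based scorer (the word lists below are invented for the example) is:

```python
# Invented mini-lexicons for illustration; SASA is a trained model,
# not a word list like this.
POSITIVE = {"good", "great", "win", "support", "hope"}
NEGATIVE = {"bad", "lose", "corrupt", "fear", "wrong"}

def sentiment_score(post):
    """(+1 per positive term, -1 per negative term) / word count,
    giving a per-post score in [-1, 1]."""
    words = post.lower().split()
    if not words:
        return 0.0
    score = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
    return score / len(words)
```

Once every post has a score, the time-series and topic analyses described above become ordinary data analytics: cumulative sums over time, counts of positive versus negative posts per candidate, and so on.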

Keywords: sentiment analysis, text mining, user generated content, US presidential elections

Procedia PDF Downloads 167
36089 Polycode Texts in Communication of Antisocial Groups: Functional and Pragmatic Aspects

Authors: Ivan Potapov

Abstract:

Background: The aim of this paper is to investigate polycode texts in the communication of youth antisocial groups. Nowadays, the notion of a text has numerous interpretations; among the approaches to defining a text, the semiotic and cultural-semiotic ones must be taken into account. Rapidly developing IT, globalization, and new ways of encoding information increase the role of the cultural-semiotic approach. However, the development of computer technologies also changes the text itself: polycode texts play an ever more important role in the everyday communication of the younger generation. Therefore, research into the functional and pragmatic aspects of both verbal and non-verbal content is quite important. Methods and Material: For this survey, we applied a combination of four methods of text investigation: intention analysis, content analysis, and semantic and syntactic analysis. These methods provided information on general text properties, the content of transmitted messages, and each communicant's intentions. In addition, we established the social background, which allowed us to distinguish intertextual connections between certain types of polycode texts. As sources of research material, we used 20 public channels in the popular messenger Telegram and data extracted from smartphones belonging to arrested members of antisocial groups. Findings: This investigation lets us assert that polycode texts can be characterized as highly intertextual language units. Moreover, we outline a classification of these texts based on communicants' intentions. The most common types of antisocial polycode texts are calls to illegal action and agitation. Furthermore, each type has its own semantic core, depending on the sphere of communication, while the syntactic structure is universal for most polycode texts.
Conclusion: Polycode texts play an important role in online communication. The results of this investigation demonstrate that in some social groups the use of these texts has a destructive influence on the younger generation and clearly needs further research.

Keywords: text, polycode text, internet linguistics, text analysis, context, semiotics, sociolinguistics

Procedia PDF Downloads 114
36088 Teaching Academic Vocabulary: A Recent and Old Approach

Authors: Sara Fine-Meltzer

Abstract:

An obvious but ill-addressed hindrance to reading comprehension in academic English is poor vocabulary. Unfortunately, dealing with the problem is usually delayed until university entrance. It is the contention of this paper that the chore should be confronted much earlier, using a very old-fashioned method. The presentation is accompanied by vocabulary lists for advanced-level university students, with explanations of the content and justification for the 500-word lists: how they change over time in accordance with evolving styles of academic writing. There are also sample quizzes and methods to ensure that the words are “absorbed” over time, along with a discussion of other vocabulary acquisition methods and conclusions drawn from their drawbacks. The paper concludes with the rationale for beginning the study of “academic” vocabulary earlier than is generally accepted.

Keywords: academic vocabulary, old-fashioned methods, quizzes, vocabulary lists

Procedia PDF Downloads 104
36087 Stability Analysis of Slopes during Pile Driving

Authors: Yeganeh Attari, Gudmund Reidar Eiksund, Hans Peter Jostad

Abstract:

In geotechnical practice, there is no standard method recognized by the industry to account for the reduction in the safety factor of a slope caused by soil displacement and pore pressure build-up during pile installation. Pile driving causes large strains and generates excess pore pressures in a zone that can extend many diameters from the installed pile, decreasing the shear strength of the surrounding soil. This phenomenon may cause slope failure. Moreover, dissipation of the excess pore pressure set up during installation may weaken areas outside the volume of soil remoulded by the installation. Because of the complex interactions between changes in mean stress and shearing, predicting the installation-induced pore pressure response is challenging, and following the rate and path of pore pressure dissipation in a slope stability analysis is a complex task. In cohesive soils it is necessary to use soil models that account for strain softening. Several cases of slope failure due to pile driving have been reported in the literature, for instance a landslide in Gothenburg in which a slope failure destroyed more than thirty houses, and the Rigaud landslide in Quebec, which resulted in loss of life. Several methods have been suggested to predict the effect of pile driving on total and effective stresses, pore pressure changes, and their effect on soil strength, but the problem is still not well understood or agreed upon. In Norway, the general approaches applied by geotechnical engineers are based on old empirical methods with little rigorous theoretical basis. While discussing the limitations of such methods, this paper attempts to capture the reduction in the factor of safety of a slope during pile driving using coupled finite element analysis and the cavity expansion method, demonstrated on a case of slope failure due to pile driving in Norway.
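As a much-simplified illustration of why excess pore pressure reduces the factor of safety, the textbook infinite-slope expression can be evaluated directly (this is a didactic sketch, not the coupled finite element analysis the paper performs, and all parameter values are hypothetical):

```python
import math

def infinite_slope_fs(c_eff, phi_deg, gamma, depth, beta_deg, pore_pressure):
    """Factor of safety of an infinite slope in effective-stress terms.

    c_eff: effective cohesion (kPa); phi_deg: friction angle (deg);
    gamma: unit weight (kN/m^3); depth: depth of slip plane (m);
    beta_deg: slope angle (deg); pore_pressure: u on the slip plane (kPa).
    """
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    sigma_n = gamma * depth * math.cos(beta) ** 2           # normal stress on slip plane
    tau = gamma * depth * math.sin(beta) * math.cos(beta)   # driving shear stress
    return (c_eff + (sigma_n - pore_pressure) * math.tan(phi)) / tau
```

Raising `pore_pressure`, as pile driving does, directly lowers the frictional term and hence the factor of safety.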

Keywords: cavity expansion method, excess pore pressure, pile driving, slope failure

Procedia PDF Downloads 131
36086 Application First and Second Digits Number in the Benford Law

Authors: Teguh Sugiarto

Abstract:

Background: This study explores fraud in financial statements using Benford's law for the first and second digits, in a case study of PT AKR Corporindo Tbk. Research Methods: The authors apply first-digit and second-digit analyses under Benford's law. Having obtained the observed digit frequencies, the authors compare them with the expected Benford frequencies using a 5% threshold. When a digit's frequency deviates by more than 5% above or below the expected value, the corresponding financial report is flagged for follow-up audit investigation, on the assumption of a possible irregularity in the financial statements. Findings: The observed first-digit frequencies in the financial reports of PT AKR Corporindo Tbk for fiscal years 2006-2010 deviate from Benford's law both above and below the 5% threshold. Conclusions: For the PT AKR Corporindo financial reports of 2006, 2007, 2008, 2009 and 2010, there are significant differences between the observed digit frequencies and those expected under Benford's law, as presented in the analysis tables.
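The digit test described above can be sketched as follows; the flagging logic with a 5% threshold mirrors the abstract's description, while the helper names are illustrative:

```python
import math

def benford_expected(position=1):
    """Expected digit frequencies under Benford's law (first or second digit)."""
    if position == 1:
        return {d: math.log10(1 + 1 / d) for d in range(1, 10)}
    # second digit: marginalise over the leading digit k = 1..9
    return {d: sum(math.log10(1 + 1 / (10 * k + d)) for k in range(1, 10))
            for d in range(10)}

def first_digit(x):
    """Leading significant digit of a positive number."""
    s = str(abs(x)).lstrip("0.").replace(".", "")
    return int(s[0])

def flag_deviations(values, threshold=0.05):
    """Digits whose observed first-digit frequency deviates from Benford by > threshold."""
    counts = {d: 0 for d in range(1, 10)}
    for v in values:
        counts[first_digit(v)] += 1
    n = len(values)
    expected = benford_expected(1)
    return {d for d in counts if abs(counts[d] / n - expected[d]) > threshold}
```

Under Benford's law the digit 1 leads about 30.1% of the time, so uniformly distributed figures are flagged immediately.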

Keywords: Benford law, first digits, second digits, Indonesian company

Procedia PDF Downloads 412
36085 Design and Finite Element Analysis of Clamp Cylinder for Capacity Augmentation of Injection Moulding Machine

Authors: Vimal Jasoliya, Purnank Bhatt, Mit Shah

Abstract:

Injection moulding is one of the principal methods of converting plastics into various end products, using a very wide range of materials from commodity plastics to specialty engineering plastics. Injection moulding machines are rated by the tonnage of clamping force applied. The present work covers the design and finite element analysis of a structural component of an injection moulding machine, the clamp cylinder, with the aim of upgrading it from 1300T to 1500T capacity. The design of the existing 1300T clamp cylinder is first verified: finite element analysis is carried out in ANSYS Workbench, and the stress values are compared with the acceptance criteria and with theoretical calculations. The relation between clamp cylinder diameter and tonnage capacity is derived and verified for the 1300T cylinder, and the same correlation is then used to find the required wall thickness for the 1500T cylinder. The detailed design of the 1500T cylinder is carried out based on the calculated thickness.
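The diameter-tonnage relation and thickness check can be illustrated with elementary pressure-vessel formulas, assuming the clamp force comes from hydraulic pressure acting on the piston area and that the Lamé thick-wall solution governs the hoop stress; the pressure and allowable-stress values in the test are hypothetical, not the machine's ratings:

```python
import math

def bore_diameter_for_tonnage(force_kN, pressure_MPa):
    """Piston bore diameter (mm) such that p * A supplies the clamp force."""
    area_mm2 = force_kN * 1e3 / pressure_MPa     # N / (N/mm^2) = mm^2
    return 2.0 * math.sqrt(area_mm2 / math.pi)

def wall_thickness_lame(bore_d_mm, pressure_MPa, allow_MPa):
    """Wall thickness (mm) from the Lame thick-wall solution.

    Peak hoop stress at the inner wall: sigma = p*(b^2 + a^2)/(b^2 - a^2),
    solved for the outer radius b given the allowable stress.
    """
    a = bore_d_mm / 2.0
    b = a * math.sqrt((allow_MPa + pressure_MPa) / (allow_MPa - pressure_MPa))
    return b - a
```

At fixed hydraulic pressure the bore diameter scales with the square root of the tonnage, which is the kind of correlation the abstract describes.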

Keywords: clamp cylinder, fatigue analysis, finite element analysis, injection moulding machines

Procedia PDF Downloads 317
36084 A Method of Detecting the Difference in Two States of Brain Using Statistical Analysis of EEG Raw Data

Authors: Digvijaysingh S. Bana, Kiran R. Trivedi

Abstract:

This paper introduces various alpha-band methods for detecting the difference between two states of the brain. One healthy subject participated in the experiment. EEG was measured on the forehead above the eye (FP1 position), with the reference and ground electrodes on an ear clip. The data samples are obtained as EEG raw data, and each reading lasts one minute. Readings were taken at different times throughout the day, and various statistical tests were performed on the alpha-band EEG raw data.
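A minimal sketch of this kind of alpha-band comparison, run on synthetic data (the band-power and Welch-t computations are generic illustrations, not the authors' exact tests):

```python
import numpy as np

def alpha_band_power(signal, fs, lo=8.0, hi=13.0):
    """Power in the alpha band (8-13 Hz) from the FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].sum()

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))

# Synthetic one-second epochs at 256 Hz: strong vs weak 10 Hz alpha activity.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
state_a = [2.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(fs) for _ in range(8)]
state_b = [0.2 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(fs) for _ in range(8)]
power_a = [alpha_band_power(e, fs) for e in state_a]
power_b = [alpha_band_power(e, fs) for e in state_b]
t_stat = welch_t(power_a, power_b)   # a large |t| separates the two states
```

Per-epoch band powers feed a standard two-sample test, which is the statistical comparison the abstract alludes to.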

Keywords: electroencephalogram (EEG), biometrics, authentication, EEG raw data

Procedia PDF Downloads 448
36083 Analysis on the Satisfaction of University-Industry Collaboration

Authors: Jeonghwan Jeon

Abstract:

Recently, industry and academia have been pursuing development through industry/university cooperation (IUC), and the government has been promoting methods to achieve successful IUC. Typical support for lead universities in IUC covers research and development (R&D), company support, cultivation of professional manpower, and marketing, and the scale of this support expands every year. Although many academic studies address IUC and report high satisfaction with its results, expectations are often not met, and the main factors driving satisfaction remain insufficiently studied. This research therefore analyses the main factors influencing satisfaction with IUC. Each factor is weighted using the analytic hierarchy process (AHP), and a portfolio analysis of importance versus current satisfaction level is performed. This will help improve the satisfaction of business participants and ensure effective IUC in the future.
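The AHP weighting step can be sketched as follows; the pairwise-comparison matrix in the example is a synthetic, perfectly consistent one, not the study's survey data:

```python
import numpy as np

# Saaty's random consistency index, by matrix size
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(A):
    """Priority weights from the principal eigenvector of a pairwise
    comparison matrix, plus Saaty's consistency ratio CR = CI / RI."""
    A = np.asarray(A, float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                         # normalise to a weight vector
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)    # consistency index
    cr = ci / RI[n] if RI[n] else 0.0
    return w, cr
```

In practice a CR below about 0.1 is taken to mean the judgments are acceptably consistent; the recovered weights then feed the importance axis of the portfolio analysis.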

Keywords: industry/university cooperation, satisfaction, portfolio analysis, business participant

Procedia PDF Downloads 480
36082 Anomaly Detection in Financial Markets Using Tucker Decomposition

Authors: Salma Krafessi

Abstract:

Financial markets form a multifaceted, intricate environment in which enormous volumes of data are produced every day. Accurate anomaly identification in this data is essential for finding investment opportunities, possible fraudulent activity, and market irregularities. Conventional anomaly detection methods frequently fail to capture the complex organization of financial data. To improve the identification of abnormalities in financial time series, this study presents Tucker decomposition as a reliable multi-way analysis approach. We start by gathering closing prices for the S&P 500 index across several decades. The information is converted into a three-dimensional tensor containing internal characteristics and temporal sequences in a sliding-window structure. The tensor is then factorised by Tucker decomposition into a core tensor and matching factor matrices, capturing latent patterns and relationships in the data. The reconstruction error of the Tucker decomposition serves as a possible sign of abnormality: by setting a statistical threshold, we identify large deviations that indicate unusual behavior. A thorough comparison of the Tucker-based method with traditional anomaly detection approaches validates our methodology. The outcomes demonstrate the superiority of Tucker decomposition in identifying intricate and subtle abnormalities that are otherwise missed. This work opens the door for further research into multi-way data analysis across a range of disciplines and emphasizes the value of tensor-based methods in financial analysis.
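The core step can be sketched with a numpy-only higher-order SVD (HOSVD), one standard way to compute a Tucker factorisation; the relative reconstruction error then serves as the anomaly signal. This is a simplification of the full methodology, not the study's implementation:

```python
import numpy as np

def unfold(T, mode):
    """Matricise tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    out = M @ unfold(T, mode)
    shape = [M.shape[0]] + [s for i, s in enumerate(T.shape) if i != mode]
    return np.moveaxis(out.reshape(shape), 0, mode)

def hosvd(T, ranks):
    """Tucker factors via HOSVD: truncated SVD of each unfolding, then project."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        core = mode_product(core, U.T, m)
    return core, factors

def reconstruct(core, factors):
    T = core
    for m, U in enumerate(factors):
        T = mode_product(T, U, m)
    return T

def reconstruction_error(T, ranks):
    """Relative reconstruction error -- large values flag candidate anomalies."""
    core, factors = hosvd(T, ranks)
    return np.linalg.norm(T - reconstruct(core, factors)) / np.linalg.norm(T)
```

In a full pipeline the sliding-window price tensor would be decomposed window by window and windows with error above a statistical threshold flagged; iterative routines such as higher-order orthogonal iteration give tighter fits than plain HOSVD.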

Keywords: tucker decomposition, financial markets, financial engineering, artificial intelligence, decomposition models

Procedia PDF Downloads 40
36081 Reading Literacy and Methods of Improving Reading

Authors: Iva Košek Bartošová, Andrea Jokešová, Eva Kozlová, Helena Matějová

Abstract:

The paper presents the results of a research team from the Faculty of Education, University of Hradec Králové in the Czech Republic. It introduces the reading methods most commonly used in the first classes of primary school and presents the results of a pilot study focused on the mastery of reading technique and the quality of reading comprehension of pupils in the first half of the school year, during training in reading by the analytic-synthetic method and by the genetic method. These two methods of practicing reading skills are the most widely used in the Czech Republic; they rest on different theoretical bases, and each has a specific educational and methodical procedure. During the school year 2015/16, two groups of first-year pupils were assessed, and quantitative and qualitative parameters of their reading output were monitored by several methods. This contribution presents the results of the pilot project and draws preliminary conclusions, which will be verified in subsequent broader research at the end of the first school year.

Keywords: analytic-synthetic method of reading, genetic method of reading, reading comprehension, reading literacy, reading methods, reading speed

Procedia PDF Downloads 238
36080 Information Visualization Methods Applied to Nanostructured Biosensors

Authors: Osvaldo N. Oliveira Jr.

Abstract:

The control of molecular architecture inherent in some experimental methods for producing nanostructured films has had great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing, in which biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides and many other molecules serve as molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy and impedance spectroscopy. This presentation provides an overview of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in biosensing experiments, computational and statistical methods have been used to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas disease and Leishmaniasis. Optimization of biosensing may also combine another information visualization method, the parallel coordinates technique, with artificial intelligence methods to identify the most suitable frequencies for reaching higher sensitivity in impedance spectroscopy. Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat biosensor data within an expert system for clinical diagnosis.
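A compact, didactic sketch of Sammon's mapping, implemented as gradient descent with backtracking on Sammon's stress (not the implementation used in the work reviewed):

```python
import numpy as np

def _dist(X, eps=1e-12):
    """Pairwise Euclidean distances, with eps to avoid division by zero."""
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1)) + eps

def sammon_stress(X, Y):
    """Sammon's stress between input-space and map-space distances."""
    D, d = _dist(X), _dist(Y)
    iu = np.triu_indices(len(X), 1)
    return ((D[iu] - d[iu]) ** 2 / D[iu]).sum() / D[iu].sum()

def sammon(X, n_iter=100, lr=1.0):
    """2-D Sammon map via gradient descent with a backtracking line search."""
    X = np.asarray(X, float)
    D = _dist(X)
    c = D[np.triu_indices(len(X), 1)].sum()
    # start from the two highest-variance input coordinates (a PCA-like init)
    Y = X[:, np.argsort(X.var(0))[::-1][:2]].copy()
    for _ in range(n_iter):
        d = _dist(Y)
        W = (D - d) / (D * d)
        np.fill_diagonal(W, 0.0)
        grad = (-2.0 / c) * (W[:, :, None] * (Y[:, None, :] - Y[None, :, :])).sum(1)
        step, e0 = lr, sammon_stress(X, Y)
        while step > 1e-8 and sammon_stress(X, Y - step * grad) >= e0:
            step *= 0.5                      # backtrack until the stress decreases
        if sammon_stress(X, Y - step * grad) < e0:
            Y = Y - step * grad
        else:
            break                            # no improving step found
    return Y
```

Sammon's stress weights each pair by the inverse of its input-space distance, which is why the technique preserves small distances (i.e., local cluster structure) better than plain metric scaling.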

Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique

Procedia PDF Downloads 314
36079 Rapid, Direct, Real-Time Method for Bacteria Detection on Surfaces

Authors: Evgenia Iakovleva, Juha Koivisto, Pasi Karppinen, J. Inkinen, Mikko Alava

Abstract:

Preventing the spread of infectious diseases worldwide is one of the most important tasks of modern health care. Infectious diseases not only account for one fifth of deaths in the world but also cause many pathological complications for human health. Touch surfaces are an important vector for the spread of infections by various microorganisms, including antimicrobial-resistant organisms; antimicrobial resistance, in turn, is the response of bacteria to the overuse or inappropriate use of antibiotics. The biggest challenges for existing bacterial detection methods are indirect determination, long analysis times, sample preparation, the use of chemicals and expensive equipment, and the need for qualified specialists. Therefore, high-performance, rapid, real-time detection is demanded for practical bacterial monitoring and control of epidemiological hazards. Among the known methods for determining bacteria on surfaces, hyperspectral methods offer direct and rapid detection on different kinds of surfaces based on fluorescence, without sampling, sample preparation or chemicals. The aim of this study was to assess the relevance of such systems to remote sensing of surfaces for microorganism detection, to help prevent the global spread of infectious diseases. Bacillus subtilis and Escherichia coli at different concentrations (from 0 to 10^8 cells/100 µL) were detected with a hyperspectral camera using different filters, visualizing bacteria and background spots on a steel plate. A method of internal standards was applied to monitor the correctness of the analysis results. The distances from the sample to the hyperspectral camera and to the light source are 25 cm and 40 cm, respectively. Each sample is optically imaged from the surface by the hyperspectral imaging system, utilizing a JAI CM-140GE-UV camera. The light source is a BeamZ FLATPAR DMX Tri-light with 3 W tri-colour LEDs (red, blue and green).
Light colors are changed through a DMX USB Pro interface. The system was calibrated following a standard procedure of setting the exposure and focusing for light with λ = 525 nm. The filter is a Thorlabs Kurios™ hyperspectral filter controller with wavelengths from 420 to 720 nm. All data collection, pre-processing and multivariate analysis were performed using LabVIEW and Python software. Bacterial stains, both visible and invisible to the human eye, clustered apart from the reference steel material in clustering analyses using different light sources and filter wavelengths. The calculation of the random and systematic errors of the analysis results demonstrated the applicability of the method under real conditions. Validation experiments were carried out with photometry and an ATP swab test; the lower detection limit of the developed method is several orders of magnitude lower than that of both validation methods, with all experimental parameters identical except the light. The hyperspectral imaging method separates not only bacteria from surfaces but also different types of bacteria, such as Gram-negative Escherichia coli and Gram-positive Bacillus subtilis. The developed method skips sample preparation and the use of chemicals, unlike other microbiological methods, and the analysis takes only a few seconds, which is innovative in the field of microbiological testing.
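The clustering step can be illustrated with a numpy-only k-means on simulated pixel spectra, assuming bacterial pixels show a fluorescence peak near 525 nm against a flat steel background (the study's actual pipeline used LabVIEW and Python multivariate tools; these spectra are synthetic stand-ins):

```python
import numpy as np

def farthest_point_init(X, k):
    """Deterministic seeding: start at X[0], then repeatedly add the farthest point."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    return np.array(centers, float)

def kmeans(X, k, n_iter=50):
    """Plain Lloyd's algorithm on pixel spectra."""
    centers = farthest_point_init(X, k)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels, centers

# Simulated spectra over the filter's 420-720 nm range (31 bands)
bands = np.linspace(420.0, 720.0, 31)
rng = np.random.default_rng(0)
steel = 0.2 + 0.01 * rng.standard_normal((20, bands.size))    # flat background
peak = np.exp(-((bands - 525.0) ** 2) / (2.0 * 20.0 ** 2))    # fluorescence near 525 nm
bacteria = 0.2 + peak + 0.01 * rng.standard_normal((20, bands.size))
X = np.vstack([steel, bacteria])
labels, _ = kmeans(X, 2)   # separates bacterial pixels from bare steel
```

With well-separated spectral signatures the assignment converges in a single pass; real data would additionally need normalisation against the internal standards mentioned above.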

Keywords: Escherichia coli, Bacillus subtilis, hyperspectral imaging, microorganisms detection

Procedia PDF Downloads 203