Search results for: permittivity measurement techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9088

8428 Engineering Method to Measure the Impact Sound Improvement with Floor Coverings

Authors: Katarzyna Baruch, Agata Szelag, Jaroslaw Rubacha, Bartlomiej Chojnacki, Tadeusz Kamisinski

Abstract:

Methodology used to measure the reduction of transmitted impact sound by floor coverings situated on a massive floor is described in ISO 10140-3: 2010. To carry out such tests, the standardised reverberation room separated by a standard floor from the second measuring room are required. The need to have a special laboratory results in high cost and low accessibility of this measurement. The authors propose their own engineering method to measure the impact sound improvement with floor coverings. This method does not require standard rooms and floor. This paper describes the measurement procedure of proposed engineering method. Further, verification tests were performed. Validation of the proposed method was based on the analytical model, Statistical Energy Analysis (SEA) model and empirical measurements. The received results were related to corresponding ones obtained from ISO 10140-3:2010 measurements. The study confirmed the usefulness of the engineering method.
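The quantity being validated here, the improvement with a floor covering, is simply the per-band difference between the normalized impact sound pressure levels measured without and with the covering. A minimal sketch, with purely hypothetical band levels (not data from the study):

```python
# Hedged sketch: per-band impact sound improvement, assuming measured
# normalized impact sound pressure levels Ln (dB) with and without the covering.
def impact_sound_improvement(ln_bare, ln_covered):
    """Per-band reduction dL = Ln,bare - Ln,covered (dB)."""
    if len(ln_bare) != len(ln_covered):
        raise ValueError("band counts must match")
    return [b - c for b, c in zip(ln_bare, ln_covered)]

bands_bare = [72.0, 74.5, 76.0]      # hypothetical third-octave levels, dB
bands_covered = [55.0, 52.5, 50.0]
print(impact_sound_improvement(bands_bare, bands_covered))  # [17.0, 22.0, 26.0]
```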

Keywords: building acoustic, impact noise, impact sound insulation, impact sound transmission, reduction of impact sound

Procedia PDF Downloads 321
8427 Towards Modern Approaches of Intelligence Measurement for Clinical and Educational Practices

Authors: Alena Kulikova, Tatjana Kanonire

Abstract:

Intelligence research is one of the oldest fields of psychology. Many factors have made research on intelligence, defined as reasoning and problem solving [1, 2], an acute and urgent problem. It has been repeatedly shown that intelligence is a predictor of academic, professional, and social achievement in adulthood (for example, [3]); moreover, intelligence predicts these achievements better than any other trait or ability [4]. At the individual level, a comprehensive assessment of intelligence is a necessary criterion for the diagnosis of various mental conditions. For example, it is a necessary condition for psychological, medical, and pedagogical commissions when deciding on educational needs and the most appropriate educational programs for school children. Assessment of intelligence is crucial in clinical psychodiagnostics and requires high-quality intelligence measurement tools. It is therefore not surprising that the development of intelligence tests is an essential part of psychological science and practice. Many modern intelligence tests, such as the Stanford-Binet test or the Wechsler test, have long histories and have been used for decades. However, the vast majority of these tests are based on the classic linear test structure, in which all respondents receive all tasks (see, for example, the critical review in [5]). This understanding of the testing procedure is a legacy of the pre-computer era, in which paper-based testing was the only diagnostic procedure available [6], and it has significant limitations that affect the reliability of the data obtained [7] and increase time costs. Another problem with measuring intelligence is that classical linear-structured tests do not fully allow measurement of a respondent's intellectual progress [8], which is undoubtedly a critical limitation. Advances in modern psychometrics make it possible to avoid the limitations of existing tools.
However, as in any rapidly developing field, psychometrics does not at present offer ready-made, straightforward solutions and requires additional research. In our presentation we discuss the strengths and weaknesses of current approaches to intelligence measurement and highlight "points of growth" for creating a test in accordance with modern psychometrics: whether it is possible to create an instrument that uses the achievements of modern psychometrics while remaining valid and practically oriented, and what the possible limitations of such an instrument would be. The theoretical framework and study design for creating and validating an original Russian comprehensive computer-based test of intellectual development in school-age children will be presented.

Keywords: Intelligence, psychometrics, psychological measurement, computerized adaptive testing, multistage testing

Procedia PDF Downloads 74
8426 Coordinated Multi-Point Scheme Based on Channel State Information in MIMO-OFDM System

Authors: Su-Hyun Jung, Chang-Bin Ha, Hyoung-Kyu Song

Abstract:

Recently, increasing the quality of experience (QoE) has become an important issue. Since performance degradation at the cell edge severely reduces QoE, several techniques are defined in the LTE/LTE-A standards to remove inter-cell interference (ICI). However, the conventional techniques have a disadvantage: there is a trade-off between resource allocation and reliable communication. The proposed scheme reduces ICI more efficiently by using channel state information (CSI) intelligently. It is shown that the proposed scheme can reduce ICI with fewer resources.

Keywords: adaptive beamforming, CoMP, LTE-A, ICI reduction

Procedia PDF Downloads 464
8425 Analyzing Test Data Generation Techniques Using Evolutionary Algorithms

Authors: Arslan Ellahi, Syed Amjad Hussain

Abstract:

Software testing is a vital process in the software development life cycle; software quality is attained by passing the software through the testing phase. Automatic test data generation is a key research area within software testing, aimed at achieving test automation that can ultimately decrease testing time. In this paper, we review approaches presented in the literature that use evolutionary search-based algorithms, such as the Genetic Algorithm and Particle Swarm Optimization (PSO), to drive the test data generation process. We also examine the quality of the generated test data, which increases or decreases the efficiency of testing. We propose test data generation techniques for model-based testing and report our work on tuning the PSO algorithm and its fitness function.
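To make the search-based idea concrete, the following is a minimal PSO sketch under an assumed toy fitness: it searches for an input value that satisfies a hypothetical branch condition (x == 37) by minimizing the distance to it. The fitness function, bounds, and coefficients are all illustrative, not the paper's actual configuration:

```python
import random

# Hypothetical branch-distance fitness: how far input x is from taking
# the branch guarded by "x == 37".
def fitness(x):
    return abs(x - 37.0)

def pso(n_particles=20, iters=100, lo=-100.0, hi=100.0, w=0.7, c1=1.5, c2=1.5):
    random.seed(0)  # deterministic for the example
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                      # each particle's personal best
    gbest = min(pos, key=fitness)       # swarm's global best
    for _ in range(iters):
        for i in range(n_particles):
            # standard velocity update: inertia + cognitive + social terms
            vel[i] = (w * vel[i]
                      + c1 * random.random() * (pbest[i] - pos[i])
                      + c2 * random.random() * (gbest - pos[i]))
            pos[i] += vel[i]
            if fitness(pos[i]) < fitness(pbest[i]):
                pbest[i] = pos[i]
            if fitness(pos[i]) < fitness(gbest):
                gbest = pos[i]
    return gbest

print(round(pso(), 2))  # converges near 37.0
```

In real test data generation the fitness would be a branch-distance or coverage measure computed by instrumenting the program under test.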

Keywords: search based, evolutionary algorithm, particle swarm optimization, genetic algorithm, test data generation

Procedia PDF Downloads 185
8424 A Decision Support Framework for Introducing Business Intelligence to Midlands Based SMEs

Authors: Amritpal Slaich, Mark Elshaw

Abstract:

This paper explores the development of a decision support framework for the introduction of business intelligence (BI) through operational research techniques for application by SMEs. Aligned with the goals of the new Midlands Enterprise Initiative of improving the skill levels of the Midlands workforce and addressing high regional unemployment, we have developed a framework to increase the level of business intelligence used by SMEs to improve business decision-making. Many SMEs in the Midlands fail because of a lack of high-quality decision-making. Our framework outlines how universities can engage with SMEs in the use of BI through operational research techniques, develop appropriate and easy-to-use Excel spreadsheet models, and use a process that allows SMEs to feed back their findings from the models. Future work will determine how well the framework performs in getting SMEs to apply BI to improve their decision-making performance.

Keywords: SMEs, decision support framework, business intelligence, operational research techniques

Procedia PDF Downloads 463
8423 Technical Realization of Key Aesthetic Principles in Guzheng Performance

Authors: Qiongzi Zheng, Lewis Cornwell, Neal Peres Da Costa

Abstract:

Drawing on Confucian and Taoist philosophy and a long-established tradition of aesthetic ideals, the Art of the Chinese Zither (Xishan Qinkuang), a classic work by the Chinese music scholar Xu Shangyin from 1643, distilled twenty-four practicing principles for the Chinese zither. This work has influenced the practice of the guzheng to the present day. While the principles were described in detail, how they can actually be achieved at a technical level remains to be explored. This study focuses on three key practicing principles: yuan (roundness), liu (fluidness), and su (swiftness), and examines how the playing techniques developed by Master Zhao Manqin contribute to their implementation. The study concludes that knowledge of finger positioning before and after the plucking motion is critical to the realization of these principles.

Keywords: Chinese music aesthetics, practicing principles of the Chinese zither, guzheng playing techniques, Zhao Manqin’s fingering techniques, Xishan Qinkuang

Procedia PDF Downloads 176
8422 Resume Ranking Using Custom Word2vec and Rule-Based Natural Language Processing Techniques

Authors: Subodh Chandra Shakya, Rajendra Sapkota, Aakash Tamang, Shushant Pudasaini, Sujan Adhikari, Sajjan Adhikari

Abstract:

Many efforts have been made to measure the semantic similarity between texts in documents, and techniques have evolved to measure the similarity of two documents. One such state-of-the-art technique in the field of Natural Language Processing (NLP) is the word-to-vector (word2vec) model, which converts words into word embeddings and measures the similarity between the resulting vectors. We found this quite useful for the task of resume ranking. This paper therefore presents an implementation of the word2vec model, along with other NLP techniques, to rank resumes against a particular job description and thereby automate part of the hiring process. The paper describes the proposed system and the findings made while building it.
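The core ranking step can be sketched as follows: represent each document by the average of its word vectors and rank resumes by cosine similarity to the job description. The tiny embedding table below is a stand-in for a trained word2vec model, and the words and dimensions are invented for illustration:

```python
import math

EMB = {  # hypothetical 2-D stand-ins for trained word2vec vectors
    "python": [1.0, 0.0], "java": [0.9, 0.1],
    "sales": [0.0, 1.0], "marketing": [0.1, 0.9],
}

def doc_vector(words):
    """Average the embeddings of known words (zero vector if none known)."""
    vecs = [EMB[w] for w in words if w in EMB]
    if not vecs:
        return [0.0, 0.0]
    return [sum(dim) / len(vecs) for dim in zip(*vecs)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_resumes(job, resumes):
    jv = doc_vector(job.split())
    return sorted(resumes, key=lambda r: -cosine(jv, doc_vector(r.split())))

print(rank_resumes("python java", ["sales marketing", "python java"]))
# ['python java', 'sales marketing']
```

In the actual system, the embeddings would come from a word2vec model trained on a resume corpus, and rule-based extraction (chunking) would pull out the skill phrases before vectorization.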

Keywords: chunking, document similarity, information extraction, natural language processing, word2vec, word embedding

Procedia PDF Downloads 153
8421 Enhancing Goal Achievement through Improved Communication Skills

Authors: Lin Xie, Yang Wang

Abstract:

An extensive body of research suggests that students, teachers, and supervisors can enhance the likelihood of reaching their goals by improving their communication skills. It is highly important to learn how and when to provide different kinds of feedback (e.g., anticipatory, corrective, and positive) to gain better results and higher morale. The purpose of this mixed-methods research is twofold: 1) to find out what factors affect effective communication among different stakeholders and how these factors affect student learning, and 2) to identify good practices for improving communication among different stakeholders and improving student achievement. This presentation begins with an introduction to recent research on Marshall's Nonviolent Communication (NVC) techniques, which comprise four important components: observations, feelings, needs, and requests. These techniques can be applied effectively at all levels of communication. To develop an in-depth understanding of the relationships among the different techniques, this research collected, compared, and combined qualitative and quantitative data to improve communication and support student learning.

Keywords: education, communication, psychology, student learning, language teaching

Procedia PDF Downloads 46
8420 Machine Translation Analysis of Chinese Dish Names

Authors: Xinyu Zhang, Olga Torres-Hostench

Abstract:

This article presents a comparative study evaluating the quality of machine translation (MT) output for Chinese gastronomic nomenclature. Chinese gastronomic culture is receiving increased international acknowledgment. The nomenclature of Chinese gastronomy not only reflects a specific aspect of culture but is also related to other areas of society such as philosophy and traditional medicine. Chinese dish names are composed of several types of cultural references, such as ingredients, colors, flavors, culinary techniques, cooking utensils, toponyms, anthroponyms, metaphors, and historical tales, among others. These cultural references constitute one of the biggest difficulties in translation, usually requiring the use of translation techniques. Given the lack of Chinese food-related translation studies, especially in Chinese-Spanish translation, and the current massive use of MT, the quality of MT output for Chinese dish names is in question. Fifty Chinese dish names with different types of cultural components were selected for this study. First, all of the dish names were translated by three different MT tools (Google Translate, Baidu Translate, and Bing Translator). Second, a questionnaire was designed and completed by 12 Chinese online users (Chinese graduates of a Hispanic Philology major) to find out user preferences regarding the collected MT output. Finally, human translation techniques were observed and analyzed to identify which techniques appear most often in the preferred MT proposals. The results reveal that the MT output for Chinese gastronomic nomenclature is not of high quality, and it would be advisable not to trust MT in settings such as restaurant menus or TV culinary shows. However, the MT output could serve as an aid for tourists wanting a general idea of a dish (its main ingredients, for example).
Literal translation turned out to be the most observed technique, followed by borrowing, generalization, and adaptation, while amplification, particularization, and transposition were infrequently observed, possibly because current MT engines are limited to relating equivalent terms and offering literal translations without taking into account the whole contextual meaning of the dish name, which is essential to the application of the less observed techniques. This could give insight into the post-editing of Chinese dish name translation. By observing and analyzing the translation techniques in the machine translators' proposals, post-editors could better decide which techniques to apply in each case so as to correct mistakes and improve the quality of the translation.

Keywords: Chinese dish names, cultural references, machine translation, translation techniques

Procedia PDF Downloads 131
8419 Musical Instruments Classification Using Machine Learning Techniques

Authors: Bhalke D. G., Bormane D. S., Kharate G. K.

Abstract:

This paper presents the classification of musical instruments using machine learning techniques. The classification has been carried out using temporal, spectral, cepstral, and wavelet features. Detailed feature analysis is carried out using separate and combined features. Further, instrument models have been developed using K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) classifiers. The benchmarked McGill University database has been used to test the performance of the system. Experimental results show that SVM performs better than the KNN classifier.
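As a small illustration of the KNN side of such a system, the sketch below classifies instruments from made-up 2-D feature points (imagined as, say, normalized spectral centroid and zero-crossing rate); the feature values and labels are hypothetical, not the paper's data:

```python
import math
from collections import Counter

# Hypothetical training set: (feature vector, instrument label)
train = [((0.20, 0.10), "piano"), ((0.25, 0.15), "piano"),
         ((0.80, 0.90), "violin"), ((0.75, 0.85), "violin")]

def knn_predict(x, k=3):
    """Label x by majority vote among its k nearest training points."""
    nearest = sorted(train, key=lambda t: math.dist(x, t[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_predict((0.22, 0.12)))  # "piano"
```

A real pipeline would extract the temporal/spectral/cepstral/wavelet features from audio first and would typically use a library SVM implementation for the stronger classifier.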

Keywords: feature extraction, SVM, KNN, musical instruments

Procedia PDF Downloads 476
8418 Effects of Foam Rolling with Different Application Volumes on the Isometric Force of the Calf Muscle with Consideration of Muscle Activity

Authors: T. Poppendieker, H. Maurer, C. Segieth

Abstract:

Over the past ten years, foam rolling has become a new trend in the fitness and health market and a frequently used technique for self-massage. However, the effects of foam rolling have only recently begun to be researched and understood. The focus of this study is the effect of prolonged foam rolling on muscle performance. Isometric muscle force was used as the parameter for determining the impact of the myofascial roller at two different application volumes. Besides maximal muscle force, data were also collected on muscle activation during all tests. Twenty-four healthy students (17 female, 7 male) with an average age of 23.4 ± 2.8 years were recruited. The study followed a crossover pre-/post design in which the order of conditions was counterbalanced. The subjects performed a one-minute and a three-minute foam rolling application set on two separate days. Maximal isometric muscle force of the dominant calf was tested before and after the self-myofascial release application. The statistical software SPSS 22 was used to analyze the maximal isometric force of the calf muscle with a 2 x 2 (time of measurement x intervention) analysis of variance with repeated measures. The statistical significance level was set at p ≤ 0.05. Significant p-values were found neither for the main effect of time of measurement (F(1,23) = .93, p = .36, f = .20) nor for the interaction of time of measurement x intervention (F(1,23) = 1.99, p = .17, f = 0.29). However, the effect sizes indicate a mean interaction effect with a tendency toward greater pre-post improvements under the three-minute foam rolling condition. Changes in maximal force did not correlate with changes in EMG activity (r = .02, p = .95 in the short and r = -.11, p = .65 in the long rolling condition).
The results support the findings of previous studies and suggest that the foam roll has positive potential as a means of keeping muscle force at least at the same performance level while increasing flexibility.

Keywords: application volume differences, foam rolling, isometric maximal force, self-myofascial release

Procedia PDF Downloads 285
8417 Choral Singers' Preference for Expressive Priming Techniques

Authors: Shawn Michael Condon

Abstract:

Current research on teaching expressivity mainly involves instrumentalists. This study focuses on choral singers' preferences among priming techniques based on four methods for teaching expressivity. 112 choral singers answered a survey about their preferred methods for priming expressivity (vocal modelling, using metaphor, tapping into felt emotions, and drawing on past experiences) in three conditions (active, passive, and instructor). Analysis revealed a higher preference for drawing on past experiences among more experienced singers. The most preferred technique in the passive and instructor roles was vocal modelling, with metaphors and tapping into felt emotions favoured in the active role. Priming techniques are often used in combination with other methods to enhance singing technique or expressivity and depend on the situation, the repertoire, and the preferences of the instructor and performer.

Keywords: emotion, expressivity, performance, singing, teaching

Procedia PDF Downloads 151
8416 Investigation of Dynamic Characteristic of Planetary Gear Set Based On Three-Axes Torque Measurement

Authors: Masao Nakagawa, Toshiki Hirogaki, Eiichi Aoyama, Mohamed Ali Ben Abbes

Abstract:

A planetary gear set is widely used in hybrid vehicles as a power distribution system and in electric vehicles as a high-reduction system, but owing to the complexity introduced by the planet gears, its dynamic characteristics are not fully understood. There are many reports on two-axis driving or on the displacement of the planet gears under such conditions, but only a few deal with three-axis driving. In this report, a three-axis driving condition is tested using three-axis torque measurement, focusing on the dynamic characteristics around the planet gears. The experimental results confirmed that the transmitted forces around the planet gears were balanced and that the torques were also balanced around the instantaneous center of rotation. The meshing frequency under these conditions was revealed to contain the harmonics of two meshing frequencies: that of the ring gear and that of the planet gears. The input power of the ring gear is distributed to the carrier and the sun gear in a dynamic sequential change of three fixed conditions: planet, star, and solar modes.
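The meshing frequencies discussed here follow from standard planetary kinematics: the mesh frequency equals the tooth count times the gear's speed relative to the carrier, and the ring-side and sun-side meshes must agree. A hedged sketch with invented tooth counts and speeds (not the test rig's values):

```python
# Willis equation for a simple planetary set: Zs*fs + Zr*fr = (Zs + Zr)*fc,
# where Z are tooth counts and f are rotational speeds in rev/s.
def sun_speed(z_sun, z_ring, f_ring, f_carrier):
    return ((z_sun + z_ring) * f_carrier - z_ring * f_ring) / z_sun

def mesh_frequency(z, f_shaft, f_carrier):
    """Mesh frequency (Hz) = teeth * |shaft speed - carrier speed|."""
    return z * abs(f_shaft - f_carrier)

Z_RING, Z_SUN = 72, 24              # hypothetical tooth counts
f_ring, f_carrier = 10.0, 2.5       # hypothetical speeds, rev/s
f_sun = sun_speed(Z_SUN, Z_RING, f_ring, f_carrier)

print(mesh_frequency(Z_RING, f_ring, f_carrier))  # ring-planet mesh: 540.0 Hz
print(mesh_frequency(Z_SUN, f_sun, f_carrier))    # sun-planet mesh: 540.0 Hz
```

The two computed values coinciding reflects the fact that the ring-planet and sun-planet meshes turn at the same mesh rate in a simple planetary stage; the measured spectrum then contains harmonics of such mesh frequencies.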

Keywords: dynamic characteristic, gear, planetary gear set, torque measuring

Procedia PDF Downloads 377
8415 Defects Classification of Stator Coil Generators by Phase Resolve Partial Discharge

Authors: Chun-Yao Lee, Nando Purba, Benny Iskandar

Abstract:

This paper proposes a phase-resolved partial discharge (PRPD) shape method to classify the defect types of stator coil generators using an off-line PD measurement instrument. The PRPD patterns recorded with the MPD600 instrument illustrate the partial discharge patterns of the units' defects. Two large units, No. 2 and No. 3, at the Inalum hydropower plant, North Sumatra, Indonesia, were adopted for the experimental measurements. The proposed PRPD shape method marks auxiliary lines on the PRPD patterns, and the shapes of the PRPD from the two units were marked with this method. Then, the four defect types of the IEC 60034-27 standard were adopted to classify the defects of the two units: microvoids (S1), delamination of the tape layer (S2), slot defects (S3), and internal delamination (S4). Finally, the two units were physically inspected to validate the availability of the proposed PRPD shape method.

Keywords: partial discharge (PD), stator coil, defect, phase resolve pd (PRPD)

Procedia PDF Downloads 254
8414 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it is estimated to kill 7 million people every year and to cost world economies $2.6 trillion by 2060 through sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need for effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy.
To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. The top air quality predictor variables were identified by measuring the mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework that leverages multiple machine learning algorithms to overcome the shortcomings of individual algorithms. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the framework, testing the framework in different locations, and developing a platform to publish future predictions automatically as a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and governments to implement effective pollution control measures.
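The averaging step described above can be sketched as follows: each model outputs per-class probabilities, the framework averages them, and the highest-probability class becomes the forecast. The class labels and probability values below are illustrative, not the study's data:

```python
# Sketch of the model-averaging step: combine the class probabilities of
# the three top-performing models into one ensemble prediction.
def ensemble_average(prob_lists):
    """Average per-class probabilities across models."""
    n = len(prob_lists)
    return [sum(p) / n for p in zip(*prob_lists)]

# Hypothetical P(good, moderate, unhealthy) from each model:
logreg = [0.70, 0.20, 0.10]
forest = [0.60, 0.30, 0.10]
neural = [0.80, 0.15, 0.05]

labels = ["good", "moderate", "unhealthy"]
avg = ensemble_average([logreg, forest, neural])
print(labels[avg.index(max(avg))])  # "good"
```

Averaging several differently specified models in this way tends to dampen the idiosyncratic errors of any single model, which is the rationale the abstract gives for the combined model's higher accuracy.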

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 122
8413 Analysis and Rule Extraction of Coronary Artery Disease Data Using Data Mining

Authors: Rezaei Hachesu Peyman, Oliyaee Azadeh, Salahzadeh Zahra, Alizadeh Somayyeh, Safaei Naser

Abstract:

Coronary artery disease (CAD) is a major cause of disability in adults and a main cause of death in developed countries. In this study, data mining techniques including decision trees, artificial neural networks (ANNs), and Support Vector Machines (SVM) were used to analyze CAD data. Data from 4,948 patients who had suffered from heart disease were included in the analysis. CAD is the target variable, and 24 input (predictor) variables are used for the classification. The performance of these techniques is compared in terms of sensitivity, specificity, and accuracy. The most significant factor influencing CAD is chest pain. Elderly males (age > 53) have a high probability of being diagnosed with CAD. The SVM algorithm is the most useful for distinguishing CAD patients from non-CAD ones. Applying data mining techniques to coronary artery disease data is a good method for investigating the relationships among the variables.
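The three comparison metrics named above are all derived from a binary confusion matrix. A minimal sketch with illustrative counts (not the study's results):

```python
# Sensitivity, specificity, and accuracy from binary confusion-matrix counts:
# tp/fn = CAD cases correctly/incorrectly classified, fp/tn likewise for non-CAD.
def metrics(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)              # true positive rate
    specificity = tn / (tn + fp)              # true negative rate
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    return sensitivity, specificity, accuracy

print(metrics(80, 20, 10, 90))  # (0.8, 0.9, 0.85)
```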

Keywords: classification, coronary artery disease, data-mining, knowledge discovery, extract

Procedia PDF Downloads 655
8412 Noise Removal Techniques in Medical Images

Authors: Amhimmid Mohammed Saffour, Abdelkader Salama

Abstract:

Filtering is part of image enhancement; it is used to enhance certain details, such as edges, that are relevant to the application, and it can also be used to eliminate unwanted noise components. Medical images typically contain salt-and-pepper noise and Poisson noise, which appear as minute grey-scale variations within the image. In this paper, different filters, namely median, Wiener, rank-order 3, rank-order 5, and average filters, were applied to CT medical images (brain and chest) to remove salt-and-pepper noise. This type of noise consists of random pixels being set to black or white. Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), and histograms were used to evaluate the quality of the filtered images. The results achieved show that these filters are useful and helpful for general medical practitioners in analyzing patients' symptoms without difficulty.
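A minimal sketch of the median filter and the MSE/PSNR evaluation, using a toy 2-D list as a stand-in for a CT slice (a 3x3 median filter that leaves borders untouched; the image values are invented):

```python
import math
from statistics import median

def median_filter(img):
    """3x3 median filter; border pixels are copied unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = median(window)
    return out

def mse(a, b):
    n = len(a) * len(a[0])
    return sum((pa - pb) ** 2 for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb)) / n

def psnr(a, b, peak=255):
    """PSNR (dB) for 8-bit images: 10*log10(peak^2 / MSE)."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10 * math.log10(peak ** 2 / e)

clean = [[100] * 5 for _ in range(5)]
noisy = [row[:] for row in clean]
noisy[2][2] = 255                       # one "salt" pixel
print(mse(clean, noisy), mse(clean, median_filter(noisy)))  # 961.0 0.0
```

A single salt pixel is an outlier in every 3x3 window that contains it, so the median removes it exactly, which is why the median filter is the usual choice for this noise type.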

Keywords: CT imaging, median filter, adaptive filter and average filter, MATLAB

Procedia PDF Downloads 309
8411 Soil Properties and Yam Performance as Influenced by Poultry Manure and Tillage on an Alfisol in Southwestern Nigeria

Authors: E. O. Adeleye

Abstract:

Field experiments were conducted to investigate the effect of soil tillage techniques and poultry manure application on soil properties and yam (Dioscorea rotundata) performance in Ondo, southwestern Nigeria, over two farming seasons. Five soil tillage techniques, namely ploughing (P), ploughing plus harrowing (PH), manual ridging (MR), manual heaping (MH), and zero tillage (ZT), each combined with and without poultry manure at a rate of 10 t ha-1, were investigated. Data were obtained on soil properties, nutrient uptake, and the growth and yield of yam. Soil moisture content, bulk density, total porosity, and post-harvest soil chemical characteristics were significantly (p>0.05) influenced by the tillage-manure treatments. Adding poultry manure to the tillage techniques increased soil total porosity and soil moisture content and reduced soil bulk density. Poultry manure improved soil organic matter, total nitrogen, available phosphorus, exchangeable Ca and K, the leaf nutrient content of yam, and yam growth and tuber yield relative to tillage plots without poultry manure. It is concluded that the possible deleterious effects of tillage on soil properties and on the growth and yield of yam on an Alfisol in southwestern Nigeria can be reduced by combining tillage with poultry manure.

Keywords: poultry manure, tillage, soil chemical properties, yield

Procedia PDF Downloads 442
8410 Development of a Finite Element Model of the Upper Cervical Spine to Evaluate the Atlantoaxial Fixation Techniques

Authors: Iman Zafarparandeh, Muzammil Mumtaz, Paniz Taherzadeh, Deniz Erbulut

Abstract:

Instability of the atlantoaxial joint may result from cervical surgery, congenital anomalies, or trauma. Different fixation techniques have been proposed for restoring stability and preventing harmful neurological deterioration. Screw constructs have become a popular alternative to older techniques for stabilizing the joint; the main difference between the various constructs is the type of screw used, which can be a lateral mass screw, pedicle screw, transarticular screw, or translaminar screw. The aim of this paper is to study the effect of three popular screw-construct fixation techniques on the biomechanics of the atlantoaxial joint using the finite element (FE) method. A three-dimensional FE model of the upper cervical spine was developed, including the skull, the C1 and C2 vertebrae, and the relevant ligament groups. The geometry of the model was obtained from CT data of a 35-year-old male. Three screw constructs were compared: the Magerl transarticular screw (TA-Screw), the Goel-Harms lateral mass screw and pedicle screw (LM-Screw and Pedicle-Screw), and the Wright lateral mass screw and translaminar screw (LM-Screw and TL-Screw). Pure moments were applied to the model in flexion (Flex), extension (Ext), axial rotation (AR) and lateral bending (LB). The range of motion (ROM) of the C0-C1 and C1-C2 segments in the implanted FE models was compared to the intact FE model and to the in vitro study of Panjabi (1988). The Magerl technique affected the ROM of C0-C1 less than the other two techniques in the sagittal plane, whereas in lateral bending and axial rotation the Goel-Harms and Wright techniques affected the ROM of C0-C1 less than the Magerl technique. The Magerl technique showed the highest fusion rate for the C1-C2 segment, 99% in all loading directions; the Wright technique showed the lowest fusion rate, 79% in lateral bending; and all three techniques achieved a 99% fusion rate in extension. The maximum stress for the Magerl technique was the lowest in all load directions. The overall maximum stress of 234 MPa occurred in flexion with the Wright technique, and for both the Goel-Harms and Wright techniques the maximum stress occurred in the lateral mass screw. The ROM results support the finding that the fusion rate of the Magerl technique exceeds 99%, and the maximum stresses indicate a lower likelihood of failure for the Magerl technique, which also uses fewer components than the other screw constructs. Despite these benefits, the Magerl technique has drawbacks, such as requiring reduction of C1 and C2 before screw placement. Other fixation methods, such as the Goel-Harms and Wright techniques, address this drawback by adding screws separately to C1 and C2. The FE model implanted with the Wright technique showed the highest maximum stress in almost all load directions.
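The fusion rate reported above is conventionally computed as the percentage reduction in segmental ROM relative to the intact model. A minimal sketch of that calculation follows; the ROM values are illustrative placeholders, not figures from the study.

```python
def fusion_rate(rom_intact_deg, rom_implanted_deg):
    """Percent reduction in segmental range of motion relative to the
    intact model; 100% would mean complete elimination of motion."""
    return 100.0 * (1.0 - rom_implanted_deg / rom_intact_deg)

# Illustrative (not measured) C1-C2 axial-rotation ROM values in degrees:
intact = 38.0
implanted = 0.38
print(f"fusion rate: {fusion_rate(intact, implanted):.0f}%")  # fusion rate: 99%
```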

Keywords: cervical spine, finite element model, atlantoaxial, fixation technique

Procedia PDF Downloads 381
8409 Sentiment Analysis: Comparative Analysis of Multilingual Sentiment and Opinion Classification Techniques

Authors: Sannikumar Patel, Brian Nolan, Markus Hofmann, Philip Owende, Kunjan Patel

Abstract:

Sentiment analysis and opinion mining have become prominent research topics in recent years, but most work has focused on data in the English language. Comprehensive research is needed that considers multiple languages, machine translation techniques, and different classifiers. This paper presents a comparative analysis of approaches to multilingual sentiment analysis, divided into two groups: classifying text directly without translation, and translating the test data into a target language, such as English, before classification. The research and results presented are useful for deciding whether machine translation should be used for multilingual sentiment analysis or whether building language-specific sentiment classification systems is the better approach. The effects of language translation techniques and feature choices, and the accuracy of various classifiers for multilingual sentiment analysis, are also discussed.
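The second approach described above (translate, then classify in the target language) can be sketched as follows. The toy corpus, the TF-IDF plus naive Bayes classifier, and the stubbed-out translation step are all illustrative assumptions, not the study's actual setup.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy English training corpus; a real study would use a labelled
# multilingual dataset of realistic size.
train_texts = ["great product, loved it", "terrible service, very bad",
               "excellent and helpful staff", "awful experience, hated it"]
train_labels = ["pos", "neg", "pos", "neg"]

clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
clf.fit(train_texts, train_labels)

# Translate-then-classify: run the foreign-language test text through a
# machine translation system (stubbed out here) and classify the English.
translated_test = ["loved the excellent product"]   # pretend MT output
print(clf.predict(translated_test))
```

The alternative approach would instead train one such pipeline per language on native-language labelled data, which is the trade-off the paper evaluates.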

Keywords: cross-language analysis, machine learning, machine translation, sentiment analysis

Procedia PDF Downloads 709
8408 A Multimodal Measurement Approach Using Narratives and Eye Tracking to Investigate Visual Behaviour in Perceiving Naturalistic and Urban Environments

Authors: Khizar Z. Choudhrya, Richard Coles, Salman Qureshi, Robert Ashford, Salim Khan, Rabia R. Mir

Abstract:

The majority of existing landscape research has been derived from heuristic evaluations, without empirical insight into real participants' visual responses. In this research, a multimodal measurement approach combining narratives and eye tracking was applied to investigate visual behaviour in perceiving naturalistic and urban environments. The research is unique in exploring gaze behaviour on environmental images possessing different levels of saliency, since eye behaviour is predominantly attracted to salient locations. The methodology for studying naturalistic and urban environments is drawn from market research: borrowing methodologies that examine visual responses and qualities provided a critical and hitherto unexplored approach. The research was conducted using mixed quantitative and qualitative methods. On the whole, the results corroborated existing landscape research findings, but they also identified potential refinements, and the research contributes both methodologically and empirically to human-environment interaction (HEI). The study focused on initial impressions of environmental images captured with eye tracking and, given the importance of the image, explored the factors that influence initial fixations in relation to expectations and preferences. A key finding is that each participant has a unique navigation style when moving through the different elements of a landscape image; this individual style is termed a 'visual signature'. The study adds clarity that completes the picture and offers insight for future landscape researchers.

Keywords: human-environment interaction (HEI), multimodal measurement, narratives, eye tracking

Procedia PDF Downloads 334
8407 Dielectric Properties in Frequency Domain of Main Insulation System of Printed Circuit Board

Authors: Xize Dai, Jian Hao, Claus Leth Bak, Gian Carlo Montanari, Huai Wang

Abstract:

The printed circuit board (PCB) is a critical component of power electronics systems, especially in high-voltage applications involving high-voltage, high-frequency SiC/GaN devices. The insulation system of the PCB faces growing challenges from high-voltage and high-frequency stress, which can alter its dielectric properties. These dielectric properties also determine the electric field distribution, which correlates with intrinsic and extrinsic aging mechanisms. Investigating the dielectric properties of the PCB insulation system in the frequency domain is therefore essential. This paper presents the frequency-, temperature-, and voltage-dependent dielectric properties of PCB insulation systems: permittivity, conductivity, and dielectric loss tangent. The mechanisms underlying the dependence on frequency, temperature, and voltage are examined from a design perspective. It can be concluded that the dielectric properties of the PCB in the frequency domain show a strong dependence on voltage, frequency, and temperature, which is associated with intrinsic conduction behavior and polarization patterns from the perspective of dielectric theory. The results may serve as a reference for PCB insulation system design in high-voltage, high-frequency, and high-temperature power electronics applications.
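The loss tangent discussed above ties together the permittivity and conductivity measurements through the standard relation tan(delta) = (eps'' + sigma / (omega * eps0)) / eps'. A minimal sketch follows; the numerical values are illustrative assumptions for an FR-4-like laminate, not measurements from the study.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def loss_tangent(eps_real, eps_imag, sigma=0.0, freq_hz=1e3):
    """Dielectric loss tangent combining polarization loss (eps'')
    and the conduction contribution sigma / (omega * eps0)."""
    omega = 2 * math.pi * freq_hz
    return (eps_imag + sigma / (omega * EPS0)) / eps_real

# Illustrative values (assumed, not measured):
print(f"tan(delta) = {loss_tangent(4.4, 0.088):.3f}")  # tan(delta) = 0.020
```

The conduction term grows as frequency falls, which is one reason the measured loss tangent is frequency dependent.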

Keywords: electrical insulation system, dielectric properties, high voltage and frequency, printed circuit board

Procedia PDF Downloads 86
8406 Feasibility Study and Experiment of On-Site Nuclear Material Identification in Fukushima Daiichi Fuel Debris by Compact Neutron Source

Authors: Yudhitya Kusumawati, Yuki Mitsuya, Tomooki Shiba, Mitsuru Uesaka

Abstract:

After the Fukushima Daiichi nuclear power reactor incident, a large amount of unaccounted-for nuclear fuel debris remains in the reactor core area, which is subject to safeguards and criticality safety. Before precise analysis can be performed, preliminary on-site screening and mapping of nuclear debris activity are needed to provide reliable data for planning the extraction of the debris mass. Through a collaboration project with the Japan Atomic Energy Agency, an on-site nuclear debris screening system has been established that combines dual-energy X-ray inspection with neutron resonance energy analysis. Using a compact, mobile pulsed neutron source constructed from a 3.95 MeV X-band electron linac, coupled with tungsten as an electron-to-photon converter and beryllium as a photon-to-neutron converter, short-distance neutron time-of-flight measurement can be performed. Experimental results show that this system can measure the neutron energy spectrum up to the 100 eV range with a time-of-flight path of only 2.5 meters, owing to the X-band accelerator's short pulse. On-site neutron time-of-flight measurement can thus identify the isotopic content of the nuclear debris through Neutron Resonance Transmission Analysis (NRTA). Preliminary NRTA experiments were performed with a tungsten sample as dummy debris material, since the isotope Tungsten-186 has a resonance absorption energy close to that of Uranium-238 (15 eV). The results show that the system can detect energy absorption in the resonance neutron region of 1-100 eV. An experiment with a combined sample of indium, tantalum, and silver showed that the system can also detect multiple elements in a material at once, making it feasible to identify debris containing mixed materials. This compact neutron time-of-flight measurement system complements the dual-energy X-ray computed tomography (CT) method, which identifies atomic number quantitatively but with 1 mm spatial resolution and large error bars. The combination of these two measurement methods will enable on-site nuclear debris screening in the Fukushima Daiichi reactor core area, providing the data needed for mapping nuclear debris activity.
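The time-of-flight measurement above maps an arrival time over the 2.5 m flight path to a non-relativistic neutron kinetic energy via E = (1/2) m (L/t)^2. A minimal sketch of that conversion:

```python
M_N = 1.674927e-27   # neutron rest mass, kg
EV = 1.602177e-19    # joules per electron-volt

def tof_energy_ev(path_m, time_s):
    """Non-relativistic neutron kinetic energy from a time-of-flight
    measurement: E = (1/2) * m * (L / t)**2, converted to eV."""
    v = path_m / time_s
    return 0.5 * M_N * v * v / EV

# Over the paper's 2.5 m flight path, a ~15 eV resonance neutron arrives
# after roughly 47 microseconds:
print(f"{tof_energy_ev(2.5, 46.7e-6):.1f} eV")
```

Because energy scales as 1/t^2, resolving the 1-100 eV resonance region over such a short path requires the sharp pulse timing the X-band linac provides.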

Keywords: neutron source, neutron resonance, nuclear debris, time of flight

Procedia PDF Downloads 232
8405 The Quality of Economic Growth Regency and Cities in West Java Province: Inclusive Economic Growth

Authors: Fryanto Anugrah Rhamdhani Rhamdhani, Hana Riana Permatasari

Abstract:

This study analyzes inclusive economic growth and its determinants in the regencies and cities of West Java Province. The motivation is that economic growth alone may fail to reduce poverty and inequality or to expand the workforce: the 2015 report of the Central Bureau of Statistics of West Java Province recorded that only 5 regions were able to reduce poverty, 3 regions were able to reduce the Gini ratio, and 7 regions were able to increase workforce absorption, even though 11 regions improved their economic growth. Based on the literature, inclusive economic growth is defined here as growth of sufficient quality to reduce poverty and the Gini ratio and to increase workforce absorption. The study adopts the Klassen measurement of inclusive economic growth and analyzes its determinants in terms of poverty reduction, Gini ratio reduction, and workforce absorption. The data are a panel combining time series and cross-section observations for 25 regencies and cities, taken from the Central Bureau of Statistics of West Java Province for 2014-2015. The Klassen measurement for 2014-2015 shows that none of the 25 regions achieved inclusive growth in reducing poverty, only 2 regions reduced the Gini ratio, and 3 regions increased workforce absorption. For workforce absorption, several regions even show a negative coefficient, indicating that economic growth reduced workforce absorption there. By analyzing the factors behind inclusive economic growth, the study offers recommendations for governments seeking inclusive growth toward a sustainable economy. It can be concluded that economic growth in the province is of low quality, because no region achieved fully inclusive economic growth.

Keywords: inclusive economic growth, Gini ratio, poverty, workforce

Procedia PDF Downloads 258
8404 Comprehensive Review of Adversarial Machine Learning in PDF Malware

Authors: Preston Nabors, Nasseh Tabrizi

Abstract:

Portable Document Format (PDF) files have gained significant popularity for sharing and distributing documents due to their universal compatibility. However, their widespread use has made them attractive targets for cybercriminals, who exploit vulnerabilities to deliver malware and compromise the security of end-user systems. This paper reviews notable contributions in PDF malware detection, including static, dynamic, signature-based, and hybrid analysis. It presents a comprehensive examination of PDF malware detection techniques, focusing on the emerging threat of adversarial sampling and the need for robust defense mechanisms, and highlights the vulnerability of machine learning classifiers to evasion attacks. It explores adversarial sampling techniques in PDF malware detection that produce mimicry and reverse-mimicry evasion attacks, which aim to bypass detection systems. Directions for future research are identified, including making methods more accessible, applying adversarial sampling techniques to malicious payloads, evaluating additional models, assessing the importance of individual features to malware, implementing adversarial defense techniques, and conducting evaluations across various scenarios. By addressing these opportunities, researchers can enhance PDF malware detection and develop more resilient defense mechanisms against adversarial attacks.
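The mimicry-style evasion discussed above can be illustrated on a toy feature-based detector. The features, the synthetic data, and the random-forest choice are all assumptions for illustration, not a real PDF malware pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy feature vectors: [javascript objects, embedded files, page count].
# Synthetic samples stand in for a real labelled PDF corpus.
malicious = np.column_stack([rng.integers(3, 8, 200),
                             rng.integers(1, 4, 200),
                             rng.integers(1, 3, 200)])
benign = np.column_stack([rng.integers(0, 2, 200),
                          rng.integers(0, 2, 200),
                          rng.integers(5, 40, 200)])
X = np.vstack([malicious, benign])
y = np.array([1] * 200 + [0] * 200)  # 1 = malicious, 0 = benign

clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.predict([[5, 2, 2]]))   # a clearly malicious-looking sample

# Mimicry-style evasion attempt: pad the file with benign-looking content
# (extra pages) without touching the payload features. Whether it succeeds
# depends on which features the detector's decision boundary relies on.
print(clf.predict([[5, 2, 30]]))
```

Adversarial defense work in the reviewed literature aims to keep the first prediction stable even when such padded samples drift toward the benign distribution.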

Keywords: adversarial attacks, adversarial defense, adversarial machine learning, intrusion detection, PDF malware, malware detection, malware detection evasion

Procedia PDF Downloads 35
8403 Template-less Self-Assembled Morphologically Cubic BiFeO₃ for Improved Electrical Properties

Authors: Jenna Metera, Olivia Graeve

Abstract:

Ceramic capacitor technologies based on lead-containing materials are being phased out because of their environmental and handling hazards. Bismuth ferrite (BiFeO₃) is the best candidate to replace these lead-based technologies. Unfortunately, the electrical properties of bismuth systems are not as robust as those of the lead alternatives. Improving electrical properties such as charge density, charge anisotropy, relative permittivity, and dielectric loss will make BiFeO₃ a competitive alternative to lead-based ceramic materials. To maximize the utility of these properties, we propose ordering via evaporation-induced self-assembly of a cubic-morphology powder. Evaporation-induced self-assembly is a template-less, bottom-up assembly route: as the solvent evaporates, capillary forces draw the particles closer together, promoting organized agglomeration at the particle faces. The assembly of particles into organized structures can yield enhanced properties compared with unorganized structures or the single particles themselves, because the interactions between particles can be controlled through the long-range order of the structure. The cubic particle morphology is produced by hydrothermal synthesis, in which changing the concentration of potassium hydroxide changes the morphology of the powder. Once the assembly materializes, the powder is consolidated and fabricated into workable substrates for electrical testing.

Keywords: evaporation, lead-free, morphology, self-assembly

Procedia PDF Downloads 120
8402 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which use different protocols, including TCP, UDP, and HTTP/S, to communicate with web servers and eventually with users. The data obtained from these devices may provide valuable information and business intelligence, but it mostly arrives in an unreadable format that must first be processed. It is mostly historical rather than current data, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since processing the data usually takes some time, this keeps the database busy and locked for the duration of the processing, which decreases the overall performance of the database server and therefore of the system. This paper follows on from a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU usage, storage, and processing time.
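The three-step technique described above can be sketched as follows; the SQLite schema and the trivial decoding step are illustrative assumptions standing in for the real device payloads.

```python
import sqlite3

def pull_process_push(db_path=":memory:"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS raw (id INTEGER, payload TEXT)")
    con.execute("CREATE TABLE IF NOT EXISTS decoded (id INTEGER, value INTEGER)")
    con.executemany("INSERT INTO raw VALUES (?, ?)", [(1, "41"), (2, "42")])

    # Step 1 - pull: copy the rows into an in-memory list, then leave the
    # database free for other clients while processing happens.
    rows = list(con.execute("SELECT id, payload FROM raw ORDER BY id"))

    # Step 2 - process: decode outside any database lock.
    decoded = [(rid, int(payload) + 1) for rid, payload in rows]

    # Step 3 - push: write the results back in one short transaction.
    with con:
        con.executemany("INSERT INTO decoded VALUES (?, ?)", decoded)
    return list(con.execute("SELECT value FROM decoded ORDER BY id"))

print(pull_process_push())  # [(42,), (43,)]
```

The point of the split is that the database is only held for the brief pull and push phases, not for the (potentially slow) processing in between.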

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 238
8401 An Evaluation of Different Weed Management Techniques in Organic Arable Systems

Authors: Nicola D. Cannon

Abstract:

A range of field experiments was conducted from 1991 to 2017 on organic land at the Royal Agricultural University's Harnhill Manor Farm near Cirencester, UK, to explore the impact of different management practices on weed infestation in organic winter and spring wheat. The experiments used randomised complete block designs, some with split-plot arrangements. Sowing date, variety choice, crop height and crop establishment technique all showed a significant impact on weed infestation. Other techniques were also investigated, with less clear but still often significant effects on weed control, including grazing with sheep, undersowing with different legumes, and mechanical weeding. Tillage treatments included traditional plough-based systems, minimum tillage and direct drilling; direct drilling had significantly higher weed dry matter than the other two techniques. Taller wheat varieties, which do not contain the Rht1 or Rht2 dwarfing genes, had lower weed populations than varieties carrying dwarfing genes. Early-sown winter wheat had greater weed dry matter than later-sown wheat. Grazing with sheep interacted strongly with sowing date: shorter varieties and later sowing dates provided much less forage, but grazing did reduce weed biomass in June. Undersowing had mixed impacts, related to how successfully the undersown legume crop established. Weeds are most successfully controlled when a range of techniques is implemented to give the wheat crop the greatest chance of competing with weeds.

Keywords: crop establishment, drilling date, grazing, undersowing, varieties, weeds

Procedia PDF Downloads 180
8400 Photovoltaic Modules Fault Diagnosis Using Low-Cost Integrated Sensors

Authors: Marjila Burhanzoi, Kenta Onohara, Tomoaki Ikegami

Abstract:

Faults in photovoltaic (PV) modules should be detected as early and as comprehensively as possible. Conventional fault detection methods such as electrical characterization, visual inspection, infrared (IR) imaging, ultraviolet fluorescence, and electroluminescence (EL) imaging are used for this purpose, but they either fail to identify the location or category of the fault, or they require expensive equipment and are inconvenient for on-site application, making them unsuitable for monitoring small-scale PV systems. Low-cost, efficient inspection techniques that can be applied on site are therefore indispensable for PV modules. In this study, to establish such an inspection technique, the correlation between faults and the magnetic flux density on the surface of crystalline PV modules is investigated. The magnetic flux on the surface of normal and faulted PV modules is measured under short-circuit and illuminated conditions using two different sensor devices. The first device is made of small integrated sensors, namely a 9-axis motion tracking sensor with an embedded 3-axis electronic compass, an IR temperature sensor, an optical laser position sensor, and a microcontroller; it measures the X, Y and Z components of the magnetic flux density (Bx, By and Bz) a few millimeters above the surface of a PV module and outputs the data as line graphs in a LabVIEW program. The second device is made of a laser optical sensor and two magnetic line sensor modules consisting of 16 magnetic sensors; it scans the magnetic field on the surface of the PV module and outputs the data as a 3D surface plot of magnetic flux intensity in a LabVIEW program. A PC equipped with LabVIEW software is used for data acquisition and analysis for both devices. To demonstrate the effectiveness of the method, the measured results are compared to those of a normal reference module and to their EL images. The experiments confirmed that the faulted areas have distinct magnetic field profiles that can be clearly identified in the measured plots. The measurement results showed a perfect correlation with the EL images, and the position sensors identified the exact location of the faults. The method was applied to different modules and detected various faults. The proposed method allows on-site measurement and real-time diagnosis, and since it uses simple sensors, the device is low-cost and convenient for small-scale or residential PV system owners.
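A simple way to post-process the 3-axis readings described above is to compare the resultant flux magnitude at each scan position against the profile of a healthy reference module. The tolerance threshold and the sample values below are assumptions for illustration.

```python
import math

def flux_magnitude(bx, by, bz):
    """Resultant flux density from the 3-axis compass components."""
    return math.sqrt(bx ** 2 + by ** 2 + bz ** 2)

def flag_anomalies(scan, reference, tolerance=0.2):
    """Indices where |B| deviates from the healthy-module profile by more
    than a relative tolerance (the 20% threshold is an assumption)."""
    return [i for i, (b, r) in enumerate(zip(scan, reference))
            if abs(b - r) > tolerance * r]

# Hypothetical (Bx, By, Bz) readings along one scan line, in microtesla:
readings = [(30.0, 40.0, 0.0), (30.0, 40.0, 1.0),
            (7.0, 9.0, 2.0), (30.0, 39.5, 0.0)]
scan = [flux_magnitude(*r) for r in readings]
reference = [50.0, 50.0, 50.0, 50.0]   # profile of a known-good module

print(flag_anomalies(scan, reference))  # [2] -> the faulted position
```

Combined with the laser position sensor, each flagged index maps back to a physical location on the module, which is how the fault location is reported.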

Keywords: fault diagnosis, fault location, integrated sensors, PV modules

Procedia PDF Downloads 222
8399 Technological Enhancements in Supply Chain Management Post COVID-19

Authors: Miran Ismail

Abstract:

COVID-19 caused widespread disruption across all economic sectors and industries around the world. The COVID-19 lockdown measures resulted in production halts, restrictions on the movement of persons and goods, border closures, logistical constraints, and a slowdown in trade and economic activity. The main subject of this paper is leveraging technology, in particular artificial intelligence, to manage the supply chain effectively and efficiently. The research methodology is based on empirical data collected through a questionnaire survey. One of the approaches utilized is a case study of industrial organizations that face obstacles such as high operational costs, large inventory levels, a lack of well-established supplier relationships, human-behavior issues, and system issues. The main contribution of this research to the body of knowledge is the empirical insight it provides into supply chain sustainability performance measurement. The results provide guidelines for selecting advanced technologies to support supply chain processes and for designing sustainable performance measurement systems.

Keywords: information technology, artificial intelligence, supply chain management, industrial organizations

Procedia PDF Downloads 119