Search results for: data mining technique
27432 Recent Development on Application of Microwave Energy on Process Metallurgy
Authors: Mamdouh Omran, Timo Fabritius
Abstract:
A growing interest in microwave heating has emerged recently. Many researchers have begun to pay attention to microwave energy as an alternative technique for processing various primary and secondary raw materials. Compared to conventional methods, microwave processing offers several advantages, such as selective, rapid, and volumetric heating. The present study summarizes our recent work on the use of microwave energy for the recovery of valuable metals from primary and secondary raw materials. The research focuses mainly on: application of microwaves for the recovery and recycling of metals from wastes of different metallurgical industries (i.e., electric arc furnace (EAF) dust, blast furnace (BF) and basic oxygen furnace (BOF) sludge); and application of microwaves for upgrading and recovering valuable metals from primary raw materials (i.e., iron ore). The results indicated that microwave heating is a promising and effective technique for processing primary raw materials and secondary steelmaking wastes. After microwave treatment of iron ore at 900 W for 60 s, grindability increased by about 28.30%. Wet high intensity magnetic separation (WHIMS) showed that magnetic separation increased from 34% to 98% after microwave treatment at 900 W for 90 s. In the case of EAF dust, after microwave processing at 1100 W for 20 min, zinc removal ranged from 64% to ~97%, depending on mixture ratio and treatment time.
Keywords: dielectric properties, microwave heating, raw materials, secondary raw materials
Procedia PDF Downloads 98
27431 Electrokinetic Application for the Improvement of Soft Clays
Authors: Abiola Ayopo Abiodun, Zalihe Nalbantoglu
Abstract:
The electrokinetic application (EKA), a relatively modern chemical treatment, has potential for in-situ ground improvement in an open field or under existing structures. It utilizes a low electrical gradient to transport electrolytic chemical ions between bespoke electrodes inserted in fine-grained, low-permeability soft soils. This paper investigates the efficacy of the EKA as a mitigation technique for soft clay beds. The laboratory model of the EKA comprises a rectangular plexiglass test tank, electrolyte compartments, geosynthetic electrodes and a direct-current supply. Within this setup, the EK effects resulting from the exchange of ions between the anolyte (anodic) and catholyte (cathodic) ends through the tested soil were examined by basic laboratory testing methods. The treated soft soil properties were investigated as a function of the anode-to-cathode distance and the curing period. The test results showed changes in the physical and engineering properties of the treated soft soils. The significant changes in the physicochemical and electrical properties suggest that they can be used as a monitoring technique to evaluate the improvement in the engineering properties of EK-treated soft clay soils.
Keywords: electrokinetic, electrolytes, exchange ions, geosynthetic electrodes, soft soils
Procedia PDF Downloads 318
27430 Ontology for a Voice Transcription of OpenStreetMap Data: The Case of Space Apprehension by Visually Impaired Persons
Authors: Said Boularouk, Didier Josselin, Eitan Altman
Abstract:
In this paper, we present a vocal ontology of OpenStreetMap data for the apprehension of space by visually impaired people. The platform, based on produsage, gives data producers the freedom to choose the descriptors of geocoded locations. Unfortunately, this freedom, also called folksonomy, complicates subsequent searches of the data. We try to solve this issue with a simple but usable method to extract data from OSM databases in order to send them to visually impaired people using text-to-speech (TTS) technology. We focus on how to help people suffering from visual disability to plan their itinerary and to comprehend a map by querying a computer and getting information about the surrounding environment in a mono-modal human-computer dialogue.
Keywords: TTS, ontology, OpenStreetMap, visually impaired
Procedia PDF Downloads 298
27429 Computerized Scoring System: A Stethoscope to Understand Consumer's Emotion through His or Her Feedback
Authors: Chen Yang, Jun Hu, Ping Li, Lili Xue
Abstract:
Most companies pay careful attention to consumer feedback collection, so the ‘feedback’ button is now common in all kinds of mobile apps. Yet it is much more challenging to analyze these feedback texts and to catch the true feelings of the consumer who hands out the feedback, whether a problem or a compliment. This is especially true for Chinese content: in one context a Chinese feedback may express a positive opinion, while in another context the same feedback may be negative. For example, the feedback 'operating with loudness' applies to both a refrigerator and a stereo system; towards a refrigerator it is clearly negative, whereas towards a stereo system it is positive. By introducing Bradley, M. and Lang, P.'s Affective Norms for English Text (ANET) theory and Bucci, W.'s Referential Activity (RA) theory, we, usability researchers at Pingan, are able to decipher the feedback and find the hidden feelings behind the content. We take two dimensions, 'valence' and 'dominance', out of the three in ANET and two dimensions, 'concreteness' and 'specificity', out of the four in RA to build our own rating system on a scale of 1 to 5 points. This rating system enables us to judge the feeling/emotion behind each feedback, and it works well with both a single word/phrase and a whole paragraph. The rating reflects the strength of the feeling/emotion of the consumer when he/she is typing the feedback. In our daily work, we first ask a consumer to answer the net promoter score (NPS) before writing the feedback, so we can determine whether the feedback is positive or negative. Secondly, we code the feedback content against the company's problem list, which contains 200 problem items; in this way we can count how many feedbacks left by consumers belong to each typical problem. Thirdly, we rate each feedback with the rating system mentioned above to quantify the strength of the feeling/emotion expressed in it. We thus obtain two kinds of data: 1) the portion, i.e., how many feedbacks are ascribed to one problem item, and 2) the severity, i.e., how strong the negative feeling/emotion is in the feedbacks regarding that problem. By crossing these two, plotting portion on the X-axis and severity on the Y-axis, we can find which typical problems score high on both portion and severity. The higher a problem scores, the more urgently it should be solved, as it means more people write stronger negative feelings in feedbacks about it. Moreover, by introducing a hidden Markov model to program our rating system, we are able to computerize the scoring and process thousands of feedbacks in a short period of time, which is efficient and accurate enough for industrial purposes.
Keywords: computerized scoring system, feeling/emotion of consumer feedback, referential activity, text mining
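A minimal sketch of the portion/severity cross-analysis described above, assuming each feedback has already been coded to a problem item and rated on the 1-5 scale; the problem names, data, and the simple urgency product are illustrative assumptions, not the authors' Pingan implementation.

```python
# Hypothetical sketch of the portion/severity cross-analysis; not the authors' code.
from collections import defaultdict
from statistics import mean

feedbacks = [
    # (problem_item, severity 1-5 from the valence/dominance/concreteness/specificity rating)
    ("login failure", 5),
    ("login failure", 4),
    ("slow loading", 2),
    ("slow loading", 3),
    ("login failure", 5),
]

by_item = defaultdict(list)
for item, severity in feedbacks:
    by_item[item].append(severity)

total = len(feedbacks)
report = []
for item, severities in by_item.items():
    portion = len(severities) / total      # X-axis: share of feedbacks hitting this problem item
    strength = mean(severities)            # Y-axis: average strength of the negative feeling
    report.append((item, portion, strength, portion * strength))

# Problems scoring high on both axes are the most urgent to fix.
for item, portion, strength, urgency in sorted(report, key=lambda r: r[3], reverse=True):
    print(f"{item:15s} portion={portion:.2f} severity={strength:.2f} urgency={urgency:.2f}")
```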
Procedia PDF Downloads 179
27428 Microgravity, Hydrological and Metrological Monitoring of Shallow Ground Water Aquifer in Al-Ain, UAE
Authors: Serin Darwish, Hakim Saibi, Amir Gabr
Abstract:
The United Arab Emirates (UAE) is situated within an arid zone where the climate is arid and groundwater recharge is very low. Groundwater is the primary source of water in the UAE. However, rapid expansion, population growth, agriculture, and industrial activities have negatively affected these limited water resources. The shortage of water resources has become a serious concern due to over-pumping of groundwater to meet demand. In addition to the groundwater deficit, the UAE has one of the highest per capita water consumption rates in the world. In this study, a combination of time-lapse measurements of microgravity and depth to groundwater level in selected wells in Al-Ain city was used to estimate the variations in groundwater storage. Al-Ain is the second largest city in the Emirate of Abu Dhabi and the third largest city in the UAE. The groundwater in this region has been overexploited. Relative gravity measurements were acquired using the Scintrex CG-6 Autograv. This latest-generation gravimeter from Scintrex Ltd provides fast, precise gravity measurements and automated corrections for temperature, tide, instrument tilt and rejection of noisy data. The CG-6 gravimeter has a resolution of 0.1 μGal. The purpose of this study is to measure groundwater storage changes in the shallow aquifers based on the application of the microgravity method. The gravity method is a nondestructive technique that allows data collection at almost any location over the aquifer. Preliminary results indicate a possible relationship between microgravity and water levels, but more work needs to be done to confirm this. The results will help to establish the relationship between monthly microgravity changes and the hydrological and hydrogeological changes of the shallow phreatic aquifer. The study will be useful for water management considerations and additional future investigations.
Keywords: Al-Ain, arid region, groundwater, microgravity
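As a hedged illustration of how a measured gravity change can be turned into a storage estimate, the sketch below uses the standard infinite Bouguer slab approximation; this relation and the numbers are assumptions for illustration, not the processing actually applied in the study.

```python
# Minimal sketch: infinite Bouguer slab relation between gravity change and water-table change.
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0       # density of water, kg/m^3
UGAL_PER_MS2 = 1e8       # 1 microGal = 1e-8 m/s^2

def slab_response_ugal_per_m(specific_yield: float) -> float:
    """Gravity change (microGal) produced by a 1 m rise of the water table."""
    return 2.0 * math.pi * G * RHO_WATER * specific_yield * UGAL_PER_MS2

def water_level_change_m(delta_g_ugal: float, specific_yield: float) -> float:
    """Invert a measured gravity change (microGal) into a water-table change (m)."""
    return delta_g_ugal / slab_response_ugal_per_m(specific_yield)

# Example: a +10 microGal monthly change, assuming a specific yield of 0.2
print(round(slab_response_ugal_per_m(1.0), 1))      # ~41.9 microGal per metre of free water
print(round(water_level_change_m(10.0, 0.2), 2))    # ~1.19 m equivalent water-table rise
```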
Procedia PDF Downloads 155
27427 Prediction of Marine Ecosystem Changes Based on the Integrated Analysis of Multivariate Data Sets
Authors: Prozorkevitch D., Mishurov A., Sokolov K., Karsakov L., Pestrikova L.
Abstract:
The current body of knowledge about the marine environment and the dynamics of marine ecosystems includes a huge amount of heterogeneous data collected over decades. It generally includes a wide range of hydrological, biological and fishery data. Marine researchers collect these data and analyze how and why the ecosystem changes from past to present. Based on these historical records and the linkages between the processes, it is possible to predict future changes. Multivariate analysis of trends and their interconnections in the marine ecosystem may be used as an instrument for predicting further ecosystem evolution. Information about the components of the marine ecosystem covering more than 50 years is used to investigate how these data arrays can help to predict the future.
Keywords: Barents Sea ecosystem, abiotic, biotic, data sets, trends, prediction
Procedia PDF Downloads 119
27426 Optical Fiber Data Throughput in a Quantum Communication System
Authors: Arash Kosari, Ali Araghi
Abstract:
A mathematical model for an optical-fiber communication channel is developed, which results in an expression that calculates the throughput and loss of the corresponding link. The data are assumed to be transmitted by using separate photons with different polarizations. The derived model also shows the dependence of data throughput on the length of the channel and the depolarization factor. It is observed that absorption of photons affects the throughput more strongly than depolarization does. In addition, the probability of depolarization and the absorption of radiated photons are obtained.
Keywords: absorption, data throughput, depolarization, optical fiber
Procedia PDF Downloads 288
27425 Cold Spray Deposition of SS316L Powders on Al5052 Substrates and Their Potential Using for Biomedical Applications
Authors: B. Dikici, I. Ozdemir, M. Topuz
Abstract:
The corrosion behaviour of 316L stainless steel coatings obtained by the cold spray method was investigated in this study. 316L powders were deposited onto Al5052 aluminum substrates. The coatings were produced using nitrogen (N2) as the process gas. In order to further improve the corrosion and mechanical properties of the coatings, heat treatment was applied at 250 and 750 °C. The corrosion performances of the coatings were compared using the potentiodynamic scanning (PDS) technique under in-vitro conditions (in Ringer's solution at 37 °C). In addition, hardness and porosity tests were carried out on the coatings. Microstructural characterization of the coatings was carried out using scanning electron microscopy with an attached energy-dispersive spectrometer (SEM-EDS) and the X-ray diffraction (XRD) technique. Clean surfaces and good adhesion were achieved for particle/substrate bonding. The heat treatment process both eliminated the anisotropy in the coating and resulted in healing of the incomplete interfaces between the deposited particles. It was found that the corrosion potential of the coatings annealed at 750 °C was higher than that of commercial 316L stainless steel. Moreover, microstructural investigations after the corrosion tests revealed that corrosion preferentially starts at inter-splat boundaries.
Keywords: biomaterials, cold spray, 316L, corrosion, heat treatment
Procedia PDF Downloads 371
27424 Forecasting Performance Comparison of Autoregressive Fractional Integrated Moving Average and Jordan Recurrent Neural Network Models on the Turbidity of Stream Flows
Authors: Daniel Fulus Fom, Gau Patrick Damulak
Abstract:
In this study, the Autoregressive Fractional Integrated Moving Average (ARFIMA) and Jordan Recurrent Neural Network (JRNN) models were employed to model the forecasting performance of the daily turbidity flow of White Clay Creek (WCC). The two methods were applied to the log difference series of the daily turbidity flow series of WCC. The error measures employed to investigate the forecasting performance of the ARFIMA and JRNN models are the Root Mean Square Error (RMSE) and the Mean Absolute Error (MAE). The investigation revealed that the forecasting performance of the JRNN technique is better than that of the ARFIMA technique in the mean square error sense. The results of the ARFIMA and JRNN models were obtained by simulating the models in MATLAB version 8.03. The significance of using the log difference series rather than the plain difference series is that the log difference stabilizes the turbidity flow series better for both the ARFIMA and JRNN models.
Keywords: autoregressive, mean absolute error, neural network, root mean square error
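A small illustrative sketch (in Python rather than the authors' MATLAB) of the pre-processing and error measures named above: the log difference transform, RMSE, and MAE. The turbidity values and stand-in forecasts are placeholders, not the WCC data.

```python
# Sketch of the log-difference transform and the RMSE/MAE error measures.
import numpy as np

def log_difference(series):
    """x_t -> log(x_t) - log(x_{t-1}); stabilises the series before ARFIMA/JRNN modelling."""
    x = np.asarray(series, dtype=float)
    return np.diff(np.log(x))

def rmse(actual, predicted):
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.sqrt(np.mean((actual - predicted) ** 2))

def mae(actual, predicted):
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.mean(np.abs(actual - predicted))

turbidity = [12.0, 13.5, 11.8, 15.2, 14.7, 16.1]     # hypothetical daily turbidity values
y = log_difference(turbidity)
y_hat = y + np.random.default_rng(0).normal(0, 0.02, size=y.shape)  # stand-in forecasts
print(f"RMSE={rmse(y, y_hat):.4f}  MAE={mae(y, y_hat):.4f}")
```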
Procedia PDF Downloads 268
27423 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network
Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi
Abstract:
Energy, delay and bandwidth are the prime issues of wireless sensor networks (WSNs). Energy usage optimization and efficient bandwidth utilization are important issues in WSNs. Event-triggered data aggregation facilitates such optimization for the event-affected area in a WSN. Reliable delivery of critical information to the sink node is also a major challenge in WSNs. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSNs that enhances the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) Whenever an event is triggered, the event-triggered node selects the cluster head. (2) The cluster head gathers data from sensor nodes within the cluster. (3) The cluster head identifies and classifies the events out of the collected data using a Bayesian classifier. (4) Aggregation of data is done using a statistical method. (5) The cluster head discovers the paths to the sink node using residual energy, path distance and bandwidth. (6) If the aggregated data is critical, the cluster head sends the aggregated data over multiple paths for reliable data communication. (7) Otherwise, the aggregated data is transmitted towards the sink node over the single path that has more bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time and energy consumed for aggregation.
Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication
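A hedged sketch of steps (4)-(7) of the scheme: statistical aggregation at the cluster head and path selection from residual energy, bandwidth and distance. The weights, the criticality test and the data are illustrative assumptions, since the abstract gives no exact formulas.

```python
# Illustrative sketch of cluster-head aggregation and path selection; weights are assumed.
from statistics import mean, stdev

def aggregate(readings):
    """Statistical aggregation at the cluster head: report mean, spread and count."""
    return {"mean": mean(readings), "stdev": stdev(readings), "count": len(readings)}

def path_score(path, w_energy=0.4, w_bandwidth=0.4, w_distance=0.2):
    """Higher residual energy and bandwidth, lower distance -> better path to the sink."""
    return (w_energy * path["residual_energy"]
            + w_bandwidth * path["bandwidth"]
            - w_distance * path["distance"])

readings = [23.1, 23.4, 22.9, 23.2, 23.0]            # e.g. temperatures from cluster members
paths = [
    {"id": "P1", "residual_energy": 0.8, "bandwidth": 0.6, "distance": 0.5},
    {"id": "P2", "residual_energy": 0.5, "bandwidth": 0.9, "distance": 0.3},
]

packet = aggregate(readings)
critical = packet["stdev"] > 1.0                      # crude criticality test for the demo
routes = sorted(paths, key=path_score, reverse=True)
chosen = routes if critical else routes[:1]           # multipath if critical, else best single path
print(packet, [p["id"] for p in chosen])
```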
Procedia PDF Downloads 452
27422 The Production of Biofertilizer from Naturally Occurring Microorganisms by Using Nuclear Technologies
Authors: K. S. Al-Mugren, A. Yahya, S. Alodah, R. Alharbi, S. H. Almsaid, A. Alqahtani, H. Jaber, A. Basaqer, N. Alajra, N. Almoghati, A. Alsalman, Khalid Alharbi
Abstract:
Context: The production of biofertilizers from naturally occurring microorganisms is an area of research that aims to enhance agricultural practices by utilizing local resources. This research project focuses on isolating and screening indigenous microorganisms with PK-fixing and phosphate solubilizing characteristics from local sources. Research Aim: The aim of this project is to develop a biofertilizer product using indigenous microorganisms and composted agro-waste as a carrier. The objective is to enhance crop productivity and soil fertility through the application of biofertilizers. Methodology: The research methodology includes several key steps. Firstly, indigenous microorganisms will be isolated from local resources using the ten-fold serial dilutions technique. Screening assays will be conducted to identify microorganisms with phosphate solubilizing and PK-fixing activities. Agro-waste materials will be collected from local agricultural sources, and composting experiments will be conducted to convert them into organic matter-rich compost. Physicochemical analysis will be performed to assess the composition of the composted agro-waste. Gamma and X-ray irradiation will be used to sterilize the carrier material. The sterilized carrier will be tested for sterility using the ten-fold serial dilutions technique. Finally, selected indigenous microorganisms will be developed into biofertilizer products. Findings: The research aims to find suitable indigenous microorganisms with phosphate solubilizing and PK-fixing characteristics for biofertilizer production. Additionally, the research aims to assess the suitability of composted agro-waste as a carrier for biofertilizers. The impact of gamma irradiation sterilization on pathogen elimination will also be investigated. Theoretical Importance: This research contributes to the understanding of utilizing indigenous microorganisms and composted agro-waste for biofertilizer production. It expands knowledge on the potential benefits of biofertilizers in enhancing crop productivity and soil fertility. Data Collection and Analysis Procedures: The data collection process involves isolating indigenous microorganisms, conducting screening assays, collecting and composting agro-waste, analyzing the physicochemical composition of composted agro-waste, and testing carrier sterilization. The analysis procedures include assessing the abilities of indigenous microorganisms, evaluating the composition of composted agro-waste, and determining the sterility of the carrier material. Conclusion: The research project aims to develop biofertilizer products using indigenous microorganisms and composted agro-waste as a carrier. Through the isolation and screening of indigenous microorganisms, the project aims to enhance crop productivity and soil fertility by utilizing local resources. The research findings will contribute to the understanding of the suitability of composted agro-waste as a carrier and the efficacy of gamma irradiation sterilization. The research outcomes will have theoretical importance in the field of biofertilizer production and agricultural practices.
Keywords: biofertilizer, microorganisms, agro-waste, nuclear technologies
Procedia PDF Downloads 142
27421 Offshore Outsourcing: Global Data Privacy Controls and International Compliance Issues
Authors: Michelle J. Miller
Abstract:
In recent years, two emerging issues that impact the global employment and business market have arisen and deserve closer review by the legal community: offshore outsourcing and data privacy. These two issues intersect because employment opportunities are shifting due to offshore outsourcing, and in some states, such as the United States, anti-outsourcing legislation has been passed or introduced to retain jobs within the country. In addition, the legal requirement for a global employer to protect the privacy of data extends to employees and third-party service providers, including services outsourced to offshore locations. For this reason, this paper reviews the intersection of these two issues with a specific focus on data privacy.
Keywords: outsourcing, data privacy, international compliance, multinational corporations
Procedia PDF Downloads 412
27420 Weighted Data Replication Strategy for Data Grid Considering Economic Approach
Authors: N. Mansouri, A. Asadi
Abstract:
Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy, called Enhanced Latest Access Largest Weight (ELALW), is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each Grid site is limited; thus, it is important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on response time, which is determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage and storage resource usage.
Keywords: data grid, data replication, simulation, replica selection, replica placement
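An illustrative sketch of the two ELALW decisions summarized above: scoring replicas for replacement and picking the replica with the lowest estimated response time. The formulas only combine the factors listed in the abstract and are assumptions, not the authors' exact expressions.

```python
# Hedged sketch of replica replacement scoring and best-replica selection.
def replica_value(expected_future_requests, size_mb, copies_in_grid):
    """Replicas with few expected requests, large size and many existing copies
    are the best candidates for deletion when a site's storage is full."""
    return expected_future_requests / (size_mb * copies_in_grid)

def response_time(site, size_mb):
    """Estimated time to fetch a replica from a candidate site."""
    transfer = size_mb / site["bandwidth_mb_s"]
    queue = site["queued_requests"] * site["storage_latency_s"]
    return transfer + site["storage_latency_s"] + queue + site["hops"] * 0.01

# Replacement: pick the least valuable local replica as the deletion victim.
replicas_on_site = [
    {"file": "f1", "future_requests": 2, "size_mb": 800, "copies": 5},
    {"file": "f2", "future_requests": 9, "size_mb": 300, "copies": 2},
]
victim = min(replicas_on_site,
             key=lambda r: replica_value(r["future_requests"], r["size_mb"], r["copies"]))

# Selection: pick the site with the smallest estimated response time.
candidates = [
    {"name": "siteA", "bandwidth_mb_s": 50.0, "storage_latency_s": 0.02, "queued_requests": 4, "hops": 3},
    {"name": "siteB", "bandwidth_mb_s": 20.0, "storage_latency_s": 0.01, "queued_requests": 0, "hops": 1},
]
best = min(candidates, key=lambda s: response_time(s, size_mb=500))
print(victim["file"], best["name"], round(response_time(best, 500), 3), "s")
```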
Procedia PDF Downloads 261
27419 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses
Authors: Matthew Baucum
Abstract:
With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g. “social”, “reward”) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word-counts. An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
Keywords: fMRI, machine learning, meta-analysis, text analysis
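A compact sketch of the core computation described above: a voxel vector built as the normalized sum of word vectors from the studies activating that voxel, scored by cosine similarity against a term vector or an averaged collection of term vectors. The toy embeddings and study lists are placeholders, not the actual corpus or semantic space.

```python
# Sketch of voxel-as-vector scoring against a query concept via cosine similarity.
import numpy as np

rng = np.random.default_rng(0)
embedding = {w: rng.normal(size=50) for w in ["social", "reward", "vision", "agent", "mind"]}

def text_vector(words):
    """Sum the word vectors of one study's text (toy version of the semantic space)."""
    return np.sum([embedding[w] for w in words if w in embedding], axis=0)

def unit(v):
    return v / np.linalg.norm(v)

# Texts of all studies that reported activation in this voxel (placeholders)
studies_at_voxel = [["social", "agent", "mind"], ["social", "reward"]]
voxel_vec = unit(np.sum([text_vector(t) for t in studies_at_voxel], axis=0))

term_vecs = [unit(embedding[t]) for t in ["social", "agent"]]   # a collection of query terms
query = unit(np.mean(term_vecs, axis=0))
cosine = float(voxel_vec @ query)          # proximity of this voxel to the query concept
print(round(cosine, 3))
```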
Procedia PDF Downloads 450
27418 Evaluation of Satellite and Radar Rainfall Product over Seyhan Plain
Authors: Kazım Kaba, Erdem Erdi, M. Akif Erdoğan, H. Mustafa Kandırmaz
Abstract:
Rainfall is a crucial data source for very different disciplines such as agriculture, hydrology and climate. Therefore, the rain rate should be known well, both spatially and temporally, for any area. Rainfall has traditionally been measured using rain gauges at meteorological ground stations for many years. At present, rainfall products are also acquired from radar and satellite images with temporal and spatial continuity. In this study, we investigated the accuracy of these rainfall data with respect to rain-gauge data. For this purpose, we used the Adana-Hatay radar hourly total precipitation product (RN1) and the Meteosat convective rainfall rate (CRR) product over the Seyhan plain. We calculated daily rainfall values from the RN1 and CRR hourly precipitation products. We used the data of rainy days from four stations located within range of the radar from October 2013 to November 2015. The two rainfall products were examined over the Seyhan plain, and the correlation between the rain-gauge data and the two raster rainfall datasets was observed to be low.
Keywords: Meteosat, radar, rainfall, rain gauge, Turkey
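A minimal sketch of the comparison workflow: hourly radar or satellite precipitation summed to daily totals and correlated (Pearson) with gauge daily totals at the co-located pixel. The arrays below are synthetic placeholders for the RN1/CRR and station data.

```python
# Sketch: aggregate hourly precipitation to daily totals and correlate with gauge data.
import numpy as np

def daily_totals(hourly):
    """Sum hourly precipitation (mm) into daily totals; length must be a multiple of 24."""
    hourly = np.asarray(hourly, dtype=float)
    return hourly.reshape(-1, 24).sum(axis=1)

rng = np.random.default_rng(1)
radar_hourly = rng.gamma(shape=0.3, scale=1.0, size=24 * 10)        # 10 days of hourly RN1-like values
gauge_daily = daily_totals(radar_hourly) * 1.2 + rng.normal(0, 2, 10)  # stand-in gauge totals

radar_daily = daily_totals(radar_hourly)
r = np.corrcoef(radar_daily, gauge_daily)[0, 1]                     # Pearson correlation
print(f"daily Pearson r = {r:.2f}")
```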
Procedia PDF Downloads 328
27417 Evaluation of Condyle Alterations after Orthognathic Surgery with a Digital Image Processing Technique
Authors: Livia Eisler, Cristiane C. B. Alves, Cristina L. F. Ortolani, Kurt Faltin Jr.
Abstract:
Purpose: This paper proposes a technically simple diagnosis method for orthodontists and maxillofacial surgeons to evaluate discrete bone alterations. The methodology consists of a protocol to optimize the diagnosis and minimize the possibility of orthodontic and ortho-surgical retreatment. Materials and Methods: A protocol of image processing and analysis, using the ImageJ software and its plugins, was applied to 20 pairs of lateral cephalometric images obtained from cone-beam computerized tomographies, before and 1 year after orthognathic surgery. The optical density of the images was analyzed in the condylar region to determine possible bone alteration after surgical correction. Results: Image density was shown to be altered in all image pairs, especially regarding the condyle contours. According to the measurements, condyle density showed a gender-related reduction for p = 0.05, and the alterations of the condylar contours were registered in mm. Conclusion: A simple, viable and cost-effective technique can be applied to achieve a more detailed image-based diagnosis that does not depend on the human eye and therefore offers more reliable, quantitative results.
Keywords: bone resorption, computer-assisted image processing, orthodontics, orthognathic surgery
Procedia PDF Downloads 161
27416 Functionalized Nano porous Ceramic Membranes for Electrodialysis Treatment of Harsh Wastewater
Authors: Emily Rabe, Stephanie Candelaria, Rachel Malone, Olivia Lenz, Greg Newbloom
Abstract:
Electrodialysis (ED) is a well-developed technology for ion removal in a variety of applications. However, many industries generate harsh wastewater streams that are incompatible with traditional ion exchange membranes. Membrion® has developed novel ceramic-based ion exchange membranes (IEMs) offering several advantages over traditional polymer membranes: high performance at low pH, chemical resistance to oxidizers, and a rigid structure that minimizes swelling. These membranes are synthesized with our patented silane-based sol-gel techniques. The pore size, shape, and network structure are engineered through a molecular self-assembly process in which thermodynamic driving forces are used to direct where and how pores form. Either cationic or anionic groups can be added within the membrane nanopore structure to create cation- and anion-exchange membranes. The ceramic IEMs are produced on a roll-to-roll manufacturing line with low-temperature processing. Membrane performance testing is conducted using in-house permselectivity, area-specific resistance, and ED stack testing setups. Ceramic-based IEMs show performance comparable to traditional IEMs and offer some unique advantages. Long exposure to highly acidic solutions has a negligible impact on ED performance. Additionally, we have observed stable performance in the presence of strong oxidizing agents such as hydrogen peroxide. This stability is expected, as the ceramic backbone of these materials is already in a fully oxidized state. These data suggest that ceramic membranes made using sol-gel chemistry could be an ideal solution for acidic and/or oxidizing wastewater streams from processes such as semiconductor manufacturing and mining.
Keywords: ion exchange, membrane, silane chemistry, nanostructure, wastewater
Procedia PDF Downloads 87
27415 The Role of Video in Teaching and Learning Pronunciation: A Case Study
Authors: Kafi Razzaq Ahmed
Abstract:
Speaking fluently in a second language requires vocabulary, grammar, and pronunciation skills, and teaching the English language entails teaching pronunciation. In the professional literature, there have been many attempts to integrate technology into improving learners' pronunciation. This technique is, however, neglected in Kurdish contexts, Salahaddin University – Erbil included. Thus, the main aim of the research is to point out the efficiency of using video materials, for both language teachers and learners, within and beyond classroom teaching and learning environments to enhance students' pronunciation. To collect practical data, a research project has been designed. In subsequent research, a posttest will be administered after each lesson to 100 first-year students in the English departments at Salahaddin University-Erbil. All students will be taught the same material using different methods, one based on video materials and the other based on the traditional approach to teaching pronunciation. Finally, the results of both tests will be analyzed (together with the attitudes of both the teachers and the students about both lessons) to indicate the impact of using video in the process of teaching and learning pronunciation.
Keywords: video, pronunciation, teaching, learning
Procedia PDF Downloads 111
27414 Major Variables Influencing Marketed Surplus of Seed Cotton in District Khanewal, Pakistan
Authors: Manan Aslam, Shafqat Rasool
Abstract:
This paper examines the impact of major factors affecting the marketed surplus of seed cotton in district Khanewal (Punjab) using primary data. A representative sample of 40 cotton farmers was selected using a stratified random sampling technique. The impact of major factors on the marketed surplus of seed cotton growers was estimated by employing the double-log form of regression analysis. The value of the adjusted R2 was 0.64, whereas the F-value was 10.81. The analysis revealed that farmers' experience, farmers' education, area under the cotton crop and distance from the wholesale market were the significant variables affecting the marketed surplus of cotton, whereas marketing cost and sale price showed an insignificant impact. The study suggests improving prevalent marketing practices to increase the volume of marketed surplus of cotton in district Khanewal.
Keywords: seed cotton, marketed surplus, double-log regression analysis
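A hedged sketch of the double-log (log-log) regression form used above, in which every variable is log-transformed before ordinary least squares so each coefficient reads as an elasticity. The variables mirror those in the abstract, but the sample values are invented for illustration.

```python
# Sketch of a double-log regression fitted by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 40
experience = rng.uniform(2, 30, n)       # years of farming experience
education = rng.uniform(1, 16, n)        # years of schooling
area = rng.uniform(1, 20, n)             # area under the cotton crop
surplus = 50 * area**0.8 * experience**0.2 * np.exp(rng.normal(0, 0.1, n))  # synthetic surplus

X = np.column_stack([np.ones(n), np.log(experience), np.log(education), np.log(area)])
y = np.log(surplus)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)          # [intercept, elasticities...]
print(dict(zip(["const", "ln_experience", "ln_education", "ln_area"], beta.round(3))))
```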
Procedia PDF Downloads 309
27413 Engineering Seismological Studies in and around Zagazig City, Sharkia, Egypt
Authors: M. El-Eraki, A. A. Mohamed, A. A. El-Kenawy, M. S. Toni, S. I. Mustafa
Abstract:
The aim of this paper is to study ground vibrations using the Nakamura technique in order to evaluate the relation between ground conditions and earthquake characteristics. Microtremor measurements were carried out at 55 sites in and around Zagazig city. The signals were processed using the horizontal-to-vertical spectral ratio (HVSR) technique to estimate the fundamental frequencies of the soil deposits and the corresponding H/V amplitudes. Seismic measurements were acquired at nine sites to record surface waves. The recorded waveforms were processed using the multi-channel analysis of surface waves (MASW) method to infer the shear wave velocity profile. The obtained fundamental frequencies were found to range from 0.7 to 1.7 Hz, and the maximum H/V amplitude reached 6.4. These results, together with the average shear wave velocity in the surface layers, were used to estimate the thickness of the uppermost soft cover layers (depth to bedrock). The sediment thickness generally increases in the northeastern and southwestern parts of the area, which is in good agreement with the local geological structure. The results of this work delineate the zones of higher potential damage in the event of an earthquake in the study area.
Keywords: ambient vibrations, fundamental frequency, surface waves, Zagazig
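A simplified sketch of the Nakamura H/V processing mentioned above: the amplitude spectra of the two horizontal components are combined, divided by the vertical spectrum, and the peak of the ratio is read as the fundamental frequency. Windowing and smoothing, which a real HVSR analysis requires, are omitted, and the record below is synthetic.

```python
# Sketch of an H/V spectral ratio and fundamental-frequency pick on a synthetic record.
import numpy as np

def hvsr(ns, ew, v, fs):
    """Return frequencies and the H/V spectral ratio for one microtremor window."""
    freqs = np.fft.rfftfreq(len(v), d=1.0 / fs)
    NS, EW, V = (np.abs(np.fft.rfft(x)) for x in (ns, ew, v))
    H = np.sqrt((NS**2 + EW**2) / 2.0)      # quadratic mean of the horizontals
    return freqs, H / np.maximum(V, 1e-12)

fs = 100.0                                   # Hz, a typical microtremor sampling rate
t = np.arange(0, 120, 1 / fs)                # a 2-minute synthetic record
rng = np.random.default_rng(0)
site = np.sin(2 * np.pi * 1.2 * t)           # pretend 1.2 Hz site resonance in the horizontals
ns = site + 0.3 * rng.normal(size=t.size)
ew = site + 0.3 * rng.normal(size=t.size)
v = 0.3 * rng.normal(size=t.size)

freqs, ratio = hvsr(ns, ew, v, fs)
band = (freqs > 0.2) & (freqs < 10)          # search the engineering frequency band only
f0 = freqs[band][np.argmax(ratio[band])]
print(f"fundamental frequency ~ {f0:.2f} Hz (synthetic demo)")
```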
Procedia PDF Downloads 284
27412 Development of a Tilt-Rotor Aircraft Model Using System Identification Technique
Authors: Ferdinando Montemari, Antonio Vitale, Nicola Genito, Giovanni Cuciniello
Abstract:
The introduction of tilt-rotor aircraft into the existing civilian air transportation system will provide beneficial effects due to the tilt-rotor's capability to combine the characteristics of a helicopter and a fixed-wing aircraft into one vehicle. The availability of reliable tilt-rotor simulation models supports the development of such a vehicle. Indeed, simulation models are required to design automatic control systems that increase safety, reduce the pilot's workload and stress, and ensure the optimal aircraft configuration with respect to flight envelope limits, especially during the most critical flight phases such as conversion from helicopter to aircraft mode and vice versa. This article presents a process to build a simplified tilt-rotor simulation model derived from the analysis of flight data. The model aims to reproduce the complex dynamics of the tilt-rotor during the in-flight conversion phase. It uses a set of scheduled linear transfer functions to relate the autopilot reference inputs to the most relevant rigid-body state variables. The model also computes information about the rotor flapping dynamics, which is useful to evaluate the aircraft control margin in terms of rotor collective and cyclic commands. The rotor flapping model is derived through a mixed theoretical-empirical approach, which includes physical analytical equations (applicable to the helicopter configuration) and parametric corrective functions. The latter are introduced to best fit the actual rotor behavior and balance the differences existing between the helicopter and the tilt-rotor during flight. Time-domain system identification from flight data is exploited to optimize the model structure and to estimate the model parameters. The presented model-building process was applied to simulated flight data of the ERICA tilt-rotor, generated by using a high-fidelity simulation model implemented in the FlightLab environment. The validation of the obtained model was very satisfactory, confirming the validity of the proposed approach.
Keywords: flapping dynamics, flight dynamics, system identification, tilt-rotor modeling and simulation
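As a toy illustration of the time-domain identification step mentioned above, the sketch below fits a first-order discrete (ARX-type) transfer function to input-output samples by least squares. The real model uses scheduled, higher-order transfer functions, so this only shows the estimation mechanics on invented data.

```python
# Sketch: least-squares identification of a first-order ARX model y[k] = a*y[k-1] + b*u[k-1].
import numpy as np

rng = np.random.default_rng(2)
N = 500
u = rng.normal(size=N)                      # e.g. an autopilot reference input
a_true, b_true = 0.9, 0.2
y = np.zeros(N)
for k in range(1, N):                       # simulate the "flight data" with small noise
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.normal()

# Least-squares estimate of (a, b) from the regressors [y[k-1], u[k-1]]
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a_hat, b_hat = theta
print(f"a = {a_hat:.3f} (true {a_true}), b = {b_hat:.3f} (true {b_true})")
```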
Procedia PDF Downloads 202
27411 Data-Driven Dynamic Overbooking Model for Tour Operators
Authors: Kannapha Amaruchkul
Abstract:
We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic-based and service-based. In the economic-based policy, we minimize the expected oversold and underused cost, whereas in the service-based policy, we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we tested the proposed overbooking policy on 2019 data. We also compare the data-driven approach to the conventional approach of fitting the data to a probability distribution.
Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator
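A hedged sketch of the service-based policy: accept the largest number of group reservations for which the simulated probability of an oversold departure stays below the threshold. The group-size distribution and size-dependent cancellation rates below are invented; in the paper they are estimated from the operator's 2016-2018 data.

```python
# Monte Carlo sketch of a service-level overbooking limit with size-dependent cancellations.
import numpy as np

rng = np.random.default_rng(0)
CAPACITY = 40
CANCEL_PROB = {1: 0.25, 2: 0.15, 3: 0.10, 4: 0.08}   # cancellation rate by group size (assumed)
GROUP_SIZES = [1, 2, 3, 4]
GROUP_WEIGHTS = [0.10, 0.50, 0.25, 0.15]              # most reservations have 2+ people

def prob_oversold(bookings, trials=10_000):
    """Estimate P(people who show up > CAPACITY) for a given number of accepted reservations."""
    oversold = 0
    for _ in range(trials):
        sizes = rng.choice(GROUP_SIZES, size=bookings, p=GROUP_WEIGHTS)
        shows = sum(int(s) for s in sizes if rng.random() > CANCEL_PROB[int(s)])
        oversold += shows > CAPACITY
    return oversold / trials

threshold = 0.05
limit = CAPACITY // max(GROUP_SIZES)                  # trivially safe starting point
while prob_oversold(limit + 1) <= threshold:          # grow the limit while service level holds
    limit += 1
print(f"accept up to {limit} reservations (P(oversold) <= {threshold})")
```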
Procedia PDF Downloads 135
27410 The Implementation of Character Education in Code Riverbanks, Special Region of Yogyakarta, Indonesia
Authors: Ulil Afidah, Muhamad Fathan Mubin, Firdha Aulia
Abstract:
The Code riverbanks in Yogyakarta are a settlement area of the middle to lower social classes, and the socio-economic situation affects the behavior of the community. This research aimed to find and explain the implementation and assessment of character education carried out in elementary schools on the Code riverside in the Yogyakarta region of Indonesia. This is a qualitative study whose subjects were the children of the Code riverbanks, Yogyakarta. The data were collected through interviews and document studies and analyzed qualitatively using the interactive analysis model of Miles and Huberman. The results show that: (1) the learning process of character education was carried out by integrating all aspects, such as democratic and interactive learning sessions, and by introducing role models to the students; (2) the assessment of character education was done by the teacher based on the teaching and learning process and on activities outside the classroom, using criteria on three aspects: cognitive, affective and psychomotor.
Keywords: character, Code riverbanks, education, Yogyakarta
Procedia PDF Downloads 251
27409 Analysis of Vocal Fold Vibrations from High-Speed Digital Images Based on Dynamic Time Warping
Authors: A. I. A. Rahman, Sh-Hussain Salleh, K. Ahmad, K. Anuar
Abstract:
Analysis of vocal fold vibration is essential for understanding the mechanism of voice production and for improving the clinical assessment of voice disorders. This paper presents a Dynamic Time Warping (DTW) based approach to analyze and objectively classify vocal fold vibration patterns. The proposed technique was designed and implemented on a Glottal Area Waveform (GAW) extracted from high-speed laryngeal images by delineating the glottal edges in each image frame. Feature extraction from the GAW was performed using Linear Predictive Coding (LPC). Several types of voice reference templates, from simulations of clear, breathy, fry, pressed and hyperfunctional voice productions, were used. The patterns of the reference templates were first verified using the analytical signal generated through the Hilbert transform of the GAW. Samples from normal speakers' voice recordings were then used to evaluate and test the effectiveness of this approach. Classification of the voice patterns using the LPC and DTW techniques gave an accuracy of 81%.
Keywords: dynamic time warping, glottal area waveform, linear predictive coding, high-speed laryngeal images, Hilbert transform
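A self-contained sketch of the classification step: a dynamic-programming DTW distance is computed between the LPC feature sequence of a recording and each reference template, and the nearest template gives the predicted voice type. The random feature sequences below stand in for real GAW-derived LPC coefficients.

```python
# Sketch of DTW-based nearest-template classification of LPC feature sequences.
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two sequences of feature vectors."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

rng = np.random.default_rng(0)
templates = {                         # reference templates: sequences of 12 LPC coefficients
    "clear": rng.normal(0.0, 1, (40, 12)),
    "breathy": rng.normal(0.5, 1, (35, 12)),
    "pressed": rng.normal(-0.5, 1, (45, 12)),
}
sample = rng.normal(0.45, 1, (38, 12))   # GAW-derived LPC sequence to classify (placeholder)

scores = {label: dtw_distance(sample, ref) for label, ref in templates.items()}
print(min(scores, key=scores.get), {k: round(v, 1) for k, v in scores.items()})
```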
Procedia PDF Downloads 240
27408 Modeling and Statistical Analysis of a Soap Production Mix in Bejoy Manufacturing Industry, Anambra State, Nigeria
Authors: Okolie Chukwulozie Paul, Iwenofu Chinwe Onyedika, Sinebe Jude Ebieladoh, M. C. Nwosu
Abstract:
This research work is based on the statistical analysis of processing data. The aim is to analyze the data statistically and to generate a design model for the production mix of soap manufacturing products at the Bejoy manufacturing company, Nkpologwu, Aguata Local Government Area, Anambra State, Nigeria. The statistical analysis shows the correlation of the data. A T-test, partial correlation and bivariate correlation were used to understand what the data portray. The design model developed was used to model the production yield, and the correlation of the variables shows that R2 is 98.7%. The results confirm that the data are fit for further analysis and modeling, as proved by the correlation and the R-squared value.
Keywords: general linear model, correlation, variables, Pearson, significance, T-test, soap, production mix, statistics
Procedia PDF Downloads 446
27407 Cloning and Characterization of Uridine-5’-Diphosphate-Glucose Pyrophosphorylases from Lactobacillus Kefiranofaciens and Rhodococcus Wratislaviensis
Authors: Mesfin Angaw Tesfay
Abstract:
Uridine-5’-diphosphate (UDP)-glucose is one of the most versatile building blocks in the metabolism of prokaryotes and eukaryotes, serving as an activated sugar donor during the glycosylation of natural products. It is formed by the enzyme UDP-glucose pyrophosphorylase (UGPase) using uridine-5′-triphosphate (UTP) and α-d-glucose 1-phosphate as substrates. Herein, two UGPase genes from Lactobacillus kefiranofaciens ZW3 (LkUGPase) and Rhodococcus wratislaviensis IFP 2016 (RwUGPase) were identified through genome mining approaches. The LkUGPase and RwUGPase have 299 and 306 amino acids, respectively. Both UGPases have the conserved UTP binding site (G-X-G-T-R-X-L-P) and the glucose-1-phosphate binding site (V-E-K-P). The LkUGPase and RwUGPase were cloned in E. coli, and SDS-PAGE analysis showed the expression of both enzymes, each forming a protein band of about 36 kDa after induction. LkUGPase and RwUGPase have activities of 1549.95 and 671.53 U/mg, respectively. Currently, their kinetic properties are under investigation.
Keywords: UGPase, LkUGPase, RwUGPase, UDP-glucose, glycosylation
Procedia PDF Downloads 23
27406 Identification of Candidate Congenital Heart Defects Biomarkers by Applying a Random Forest Approach on DNA Methylation Data
Authors: Kan Yu, Khui Hung Lee, Eben Afrifa-Yamoah, Jing Guo, Katrina Harrison, Jack Goldblatt, Nicholas Pachter, Jitian Xiao, Guicheng Brad Zhang
Abstract:
Background and Significance of the Study: Congenital heart defects (CHDs) are the most common malformation at birth and one of the leading causes of infant death. Although the exact etiology remains a significant challenge, epigenetic modifications, such as DNA methylation, are thought to contribute to the pathogenesis of congenital heart defects. At present, no DNA methylation biomarkers are used for early detection of CHDs. The existing CHD diagnostic techniques are time-consuming and costly and can only be used to diagnose CHDs after an infant is born. The present study employed a machine learning technique to analyse genome-wide methylation data in children with and without CHDs, with the aim of finding methylation biomarkers for CHDs. Methods: The Illumina Human Methylation EPIC BeadChip was used to screen the genome-wide DNA methylation profiles of 24 infants diagnosed with congenital heart defects and 24 healthy infants without congenital heart defects. Primary pre-processing was conducted using the RnBeads and limma packages. The methylation levels of the top 600 genes with the lowest p-values were selected and further investigated using a random forest approach. ROC curves were used to analyse the sensitivity and specificity of each biomarker in both the training and test sample sets. The functionalities of selected genes with high sensitivity and specificity were then assessed in molecular processes. Major Findings of the Study: Three genes (MIR663, FGF3, and FAM64A) were identified from both the training and validation data by random forests, with an average sensitivity and specificity of 85% and 95%. GO analyses for the top 600 genes showed that these putative differentially methylated genes were primarily associated with regulation of lipid metabolic process, protein-containing complex localization, and the Notch signalling pathway. The present findings highlight that aberrant DNA methylation may play a significant role in the pathogenesis of congenital heart defects.
Keywords: biomarker, congenital heart defects, DNA methylation, random forest
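A minimal sketch of the modelling step described above: a random forest trained on methylation beta values and evaluated with an ROC-based score, with feature importances pointing to candidate biomarkers. The synthetic 48-sample, 600-feature matrix below stands in for the EPIC array data.

```python
# Sketch: random forest on methylation beta values with ROC evaluation and feature ranking.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_samples, n_features = 48, 600                    # 24 CHD + 24 controls, top 600 genes
X = rng.beta(2, 5, size=(n_samples, n_features))   # methylation beta values in [0, 1]
y = np.repeat([1, 0], n_samples // 2)              # 1 = CHD, 0 = healthy control
X[y == 1, :3] += 0.15                              # plant a signal in 3 "marker" features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
top = np.argsort(clf.feature_importances_)[::-1][:3]   # candidate biomarker features
print(f"test AUC = {auc:.2f}, top features = {top}")
```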
Procedia PDF Downloads 160
27405 Optimization of Real Time Measured Data Transmission, Given the Amount of Data Transmitted
Authors: Michal Kopcek, Tomas Skulavik, Michal Kebisek, Gabriela Krizanova
Abstract:
The operation of nuclear power plants involves continuous monitoring of the environment in their area. This monitoring is performed using a complex data acquisition system, which collects status information about the system itself and the values of many important physical variables, e.g. temperature, humidity, dose rate, etc. This paper describes a proposal for, and optimization of, the communication that takes place in a teledosimetric system between the central control server, responsible for processing and storing the data, and the decentralized measuring stations, which measure the physical variables. Analyses of the ongoing communication were performed, and consequently the system architecture and communication were optimized.
Keywords: communication protocol, transmission optimization, data acquisition, system architecture
Procedia PDF Downloads 521
27404 Study of Error Analysis and Sources of Uncertainty in the Measurement of Residual Stresses by the X-Ray Diffraction
Authors: E. T. Carvalho Filho, J. T. N. Medeiros, L. G. Martinez
Abstract:
Residual stresses are self-equilibrating stresses that act on the microstructure of a rigid body without the application of an external load. They are elastic stresses and can be induced by mechanical, thermal and chemical processes, causing a deformation gradient in the crystal lattice and favoring premature failure in mechanical components. The search for measurements with good reliability has been of great importance for the manufacturing industries. Several methods are able to quantify these stresses according to physical principles and the mechanical behavior of the material. The X-ray diffraction technique is one of the most sensitive techniques for detecting small variations of the crystalline lattice, since the X-ray beam interacts with the interplanar distance. Being a very sensitive technique, it is also susceptible to variations in the measurements, requiring a study of the factors that influence the final result. Instrumental and operational factors, form deviations of the samples and the geometry of analysis are some of the variables that need to be considered and analyzed in order to obtain the true measurement. The aim of this work is to analyze the sources of error inherent to the residual stress measurement process by the X-ray diffraction technique, making an interlaboratory comparison to verify the reproducibility of the measurements. In this work, two specimens were machined, differing from each other by the surface finishing: grinding and polishing. Additionally, iron powder with a particle size of less than 45 µm was selected in order to serve as a reference (as recommended by the ASTM E915 standard) for the tests. To verify the deviations caused by the equipment, these specimens were positioned and, under the same analysis conditions, seven measurements were carried out at 11 ψ tilts. To verify sample positioning errors, seven measurements were performed, repositioning the sample before each measurement. To check geometry errors, measurements were repeated for the Bragg-Brentano and parallel-beam geometries. In order to verify the reproducibility of the method, the measurements were performed in two different laboratories with different equipment. The results were statistically analyzed and the errors quantified.
Keywords: residual stress, X-ray diffraction, repeatability, reproducibility, error analysis
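For context, a hedged sketch of the sin²ψ evaluation that these repeated measurements feed into: lattice strains at several ψ tilts are fitted linearly against sin²ψ, and the slope is converted to a stress with elastic constants. The constants and spacings below are illustrative, not the study's data, and the shear and hydrostatic terms of the full relation are omitted.

```python
# Sketch of a simplified sin^2(psi) fit: strain vs sin^2(psi) slope -> residual stress.
import numpy as np

E = 210e9          # Young's modulus of steel, Pa (assumed material)
NU = 0.28          # Poisson's ratio (assumed)
D0 = 1.1702e-10    # unstressed lattice spacing, m (illustrative)

psi_deg = np.linspace(0, 45, 11)                     # 11 psi tilts, as in the study
sin2psi = np.sin(np.radians(psi_deg)) ** 2
true_stress = -350e6                                 # Pa, a compressive residual stress
# Synthetic d-spacings; the -nu/E*(sigma1+sigma2) offset term is omitted for simplicity.
d = D0 * (1 + (1 + NU) / E * true_stress * sin2psi)

strain = (d - D0) / D0
slope, intercept = np.polyfit(sin2psi, strain, 1)    # linear fit of strain vs sin^2(psi)
stress = slope * E / (1 + NU)                        # sigma_phi recovered from the slope
print(f"estimated residual stress = {stress / 1e6:.0f} MPa")
```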
Procedia PDF Downloads 182
27403 The Public Relations Activities on Social Networking Sites for Communication to the Customer: Case Study the Company in Thailand
Authors: Phakit Treesukol
Abstract:
The purpose of this investigation is to ascertain Internet users' behaviour towards companies' public relations activities on social networking sites. Data were collected using the quota sampling method from a total of 100 Internet users who are members of social networking sites (SNS) and used the Internet during the period 10 December 2009 to 9 January 2010. An online self-administered questionnaire was distributed through Facebook, Hi5 and Twitter to Internet users using a snowball sampling technique. The results of the study showed that the majority of the respondents were using social networking sites mainly to contact their friends. At the time of the study, most of the respondents were not regularly receiving companies' public relations activities on social networking sites. The most frequent survey response was for hiding or deleting information introducing new products or services from companies on SNS.
Keywords: media uses and gratification, online activities, public relations activities, social networking sites
Procedia PDF Downloads 257