Search results for: test data compression (TDC)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9641

8441 Anticipation of Bending Reinforcement Based on Iranian Concrete Code Using Meta-Heuristic Tools

Authors: Seyed Sadegh Naseralavi, Najmeh Bemani

Abstract:

In this paper, concrete design codes from several countries and regions, including the United States, New Zealand, Mexico, Italy, India, Canada, Hong Kong, the Eurocode, and Britain, are compared with the Iranian concrete design code. First, using an Adaptive Neuro-Fuzzy Inference System (ANFIS), the codes having the highest correlation with the ninth issue of the Iranian national regulation are determined. Two prediction methods are then used for comparing the codes: an Artificial Neural Network (ANN) and multi-variable regression. The results show that the ANN performs better. Prediction is carried out using only the tensile steel ratio, ignoring the compression steel ratio.
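As an illustration of the comparison described above, the sketch below contrasts a small feed-forward neural network with multi-variable linear regression; the feature names, synthetic data, and network size are hypothetical stand-ins for the code-derived data set used in the paper.

```python
# Minimal sketch (hypothetical data): compare an ANN with multi-variable
# regression for predicting a reinforcement ratio from section parameters.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 500
# Hypothetical predictors: section width b, effective depth d, applied moment M.
X = np.column_stack([rng.uniform(200, 600, n),   # b (mm)
                     rng.uniform(300, 800, n),   # d (mm)
                     rng.uniform(50, 500, n)])   # M (kN.m)
# Hypothetical target: tensile steel ratio (nonlinear in the inputs).
y = 100 * X[:, 2] / (X[:, 0] * X[:, 1]) + rng.normal(0, 0.01, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

reg = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)

print("regression RMSE:", mean_squared_error(y_te, reg.predict(X_te)) ** 0.5)
print("ANN RMSE:       ", mean_squared_error(y_te, ann.predict(X_te)) ** 0.5)
```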

Keywords: Concrete design code, anticipation method, artificial neural network, multi-variable regression, adaptive neuro-fuzzy inference system.

8440 Estimation Model for Concrete Slump Recovery by Using Superplasticizer

Authors: Chaiyakrit Raoupatham, Ram Hari Dhakal, Chalermchai Wanichlamlert

Abstract:

This paper introduces a practical solution for concrete slump recovery using a type-F chemical admixture (naphthalene-based superplasticizer) in order to solve the problem of unusable concrete that has lost its slump, which is especially relevant for tropical countries with faster slump loss rates. On the other hand, adding superplasticizer to concrete arbitrarily can cause the concrete to segregate. Therefore, this paper also develops an estimation model used to calculate the second dose of superplasticizer needed for slump recovery. Fresh properties of ordinary Portland cement concrete with a volumetric ratio of paste to void between aggregate (paste content) of 1.1-1.3, a water-cement ratio of 0.30 to 0.67, and an initial superplasticizer (naphthalene-based) dosage of 0.25%-1.6% were tested for initial slump and for slump loss every 30 minutes over one and a half hours by the slump cone test. Concretes with slump loss ranging from 10% to 90% were re-dosed and successfully recovered back to their initial slump; the slump after re-dosing was measured by the slump cone test. From the results, it is concluded that slump loss was slower for mixes with a high initial dose of superplasticizer, because the added superplasticizer disturbs cement hydration. The required second dose of superplasticizer was affected by two major parameters, the water-cement ratio and the paste content: a lower water-cement ratio and lower paste content increase the required second dose. The second dose of superplasticizer is higher as the solid content within the system increases, where the solids can come either from cement particles or from aggregate. The data were analyzed to form an equation used to estimate the second dosage of superplasticizer required to recover the slump to its original value.
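A minimal sketch of how such an estimation equation could be fitted is shown below; the linear model form, the variable names (water-cement ratio, paste content, slump loss), and the sample numbers are illustrative assumptions, not the equation derived in the paper.

```python
# Minimal sketch (hypothetical data and model form): fit an equation that
# estimates the second superplasticizer dose from mix parameters.
import numpy as np

# Columns: water-cement ratio, paste content, fractional slump loss.
X = np.array([[0.35, 1.1, 0.6],
              [0.45, 1.2, 0.4],
              [0.55, 1.3, 0.3],
              [0.30, 1.1, 0.8],
              [0.67, 1.3, 0.2]])
# Measured second dose (% by cement weight) needed to recover initial slump.
y = np.array([0.9, 0.6, 0.4, 1.2, 0.2])

# Least-squares fit of: dose = a0 + a1*(w/c) + a2*paste + a3*slump_loss
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", coef)

def second_dose(wc, paste, loss):
    """Estimate the second superplasticizer dose for a new mix (sketch only)."""
    return coef @ np.array([1.0, wc, paste, loss])

print("estimated dose:", second_dose(0.40, 1.2, 0.5))
```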

Keywords: Estimation model, second superplasticizer dosage, slump loss, slump recovery.

8439 Septic B-spline Collocation Method for Solving One-dimensional Hyperbolic Telegraph Equation

Authors: Marzieh Dosti, Alireza Nazemi

Abstract:

Recently, the telegraph equation has been found to be more suitable than the ordinary diffusion equation for modelling reaction-diffusion processes in several branches of science. In this paper, a numerical solution of the one-dimensional hyperbolic telegraph equation using the collocation method with septic splines is proposed. The scheme works in a similar fashion to finite difference methods. The accuracy of the presented method is demonstrated by two test problems, which are used to validate the scheme by calculating the L2-norm and L∞-norm. The numerical results are found to be in good agreement with the exact solutions.
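For reference, the error norms used to validate such a scheme can be computed as in the sketch below; the grid, exact solution, and numerical values are placeholders, since the paper's test problems are not reproduced here.

```python
# Minimal sketch: discrete L2 and L-infinity error norms between a numerical
# solution and the exact solution on a uniform grid (placeholder data).
import numpy as np

h = 0.01                                  # uniform grid spacing
x = np.arange(0.0, 1.0 + h, h)
u_exact = np.sin(np.pi * x)               # placeholder exact solution
u_num = u_exact + 1e-4 * np.cos(3 * x)    # placeholder numerical solution

err = u_num - u_exact
l2_norm = np.sqrt(h * np.sum(err ** 2))   # discrete L2 norm
linf_norm = np.max(np.abs(err))           # L-infinity norm

print(f"L2 error  : {l2_norm:.3e}")
print(f"Linf error: {linf_norm:.3e}")
```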

Keywords: B-spline, collocation method, second-order hyperbolic telegraph equation, difference schemes.

8438 Comparison of ANN and Finite Element Model for the Prediction of Ultimate Load of Thin-Walled Steel Perforated Sections in Compression

Authors: Zhi-Jun Lu, Qi Lu, Meng Wu, Qian Xiang, Jun Gu

Abstract:

The analysis of perforated steel members is a 3D problem in nature; therefore, the traditional analytical expressions for the ultimate load of thin-walled steel sections cannot be used for perforated steel member design. In this study, the finite element method (FEM) and an artificial neural network (ANN) were used to simulate the process of stub column tests based on specific codes. Results show that, compared with those of the FEM model, the ultimate load predictions obtained from the ANN technique were much closer to those obtained from the physical experiments. The ANN model is very promising for solving the hard problem of complex steel perforated sections.

Keywords: Artificial neural network, finite element method, perforated sections, thin-walled steel, ultimate load.

8437 Improved K-Modes for Categorical Clustering Using Weighted Dissimilarity Measure

Authors: S. Aranganayagi, K. Thangavel

Abstract:

K-Modes is an extension of the K-Means clustering algorithm, developed to cluster categorical data, where the mean is replaced by the mode. The similarity measure proposed by Huang is the simple matching or mismatching measure. The weights of attribute values contribute much to clustering; thus, in this paper, we propose a new weighted dissimilarity measure for K-Modes, based on the ratio of the frequency of attribute values in the cluster to that in the data set. The new weighted measure is evaluated on data sets obtained from the UCI data repository. The results are compared with K-Modes and K-representatives, showing that the new measure generates clusters with higher purity.
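A small sketch of the kind of frequency-weighted dissimilarity described above is given below; the exact weighting formula used by the authors is not reproduced, so the ratio-based weight here is only an illustrative assumption.

```python
# Minimal sketch (illustrative weighting, not the authors' exact formula):
# frequency-weighted dissimilarity between an object and a cluster mode.
from collections import Counter

def weighted_dissimilarity(obj, mode, cluster, dataset):
    """obj, mode: tuples of categorical values; cluster, dataset: lists of tuples."""
    d = 0.0
    for j, (x, m) in enumerate(zip(obj, mode)):
        if x != m:
            d += 1.0                     # plain mismatch cost
        else:
            f_cluster = Counter(r[j] for r in cluster)[x] / len(cluster)
            f_data = Counter(r[j] for r in dataset)[x] / len(dataset)
            # Matches on values that are frequent in the cluster relative to
            # the whole data set are considered "stronger" (smaller cost).
            d += 1.0 - min(1.0, f_cluster / f_data)
    return d

dataset = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y"), ("a", "x")]
cluster = [("a", "x"), ("a", "y"), ("a", "x")]
print(weighted_dissimilarity(("a", "x"), ("a", "x"), cluster, dataset))
```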

Keywords: Clustering, categorical data, K-Modes, weighted dissimilarity measure

8436 Improving Performance of World Wide Web by Adaptive Web Traffic Reduction

Authors: Achuthsankar S. Nair, J. S. Jayasudha

Abstract:

The ever-increasing use of the World Wide Web on existing networks results in poor performance. Several techniques have been developed for reducing web traffic, such as compressing files, saving web pages at the client side, and changing the bursty nature of traffic into a constant rate. No single method is adequate for accessing documents instantly over the Internet. In this paper, adaptive hybrid algorithms are developed for reducing web traffic. Intelligent agents are used for monitoring the web traffic. Depending on the bandwidth usage, the user's preferences, and the server and browser capabilities, the intelligent agents select the best techniques to achieve maximum traffic reduction. Web caching, compression, filtering, optimization of HTML tags, and traffic dispersion are incorporated into this adaptive selection. Using this new hybrid technique, latency is reduced by 20-60% and the cache hit ratio is increased by 40-82%.

Keywords: Bandwidth, Congestion, Intelligent Agents, Prefetching, Web Caching.

8435 Induction Motor Design with Limited Harmonic Currents Using Particle Swarm Optimization

Authors: C. Thanga Raj, S. P. Srivastava, Pramod Agarwal

Abstract:

This paper presents an optimal design of a poly-phase induction motor using Quadratic Interpolation based Particle Swarm Optimization (QI-PSO). The optimization algorithm considers the efficiency, starting torque, and temperature rise as objective functions (treated separately) and ten performance-related items, including harmonic current, as constraints. The QI-PSO algorithm was implemented on a test motor, and the results are compared with the Simulated Annealing (SA) technique, Standard Particle Swarm Optimization (SPSO), and a normal design. Some benchmark problems are also used for validating QI-PSO. From the test results, QI-PSO gave better results and is more suitable for the motor's design optimization. Cµ code is used for implementing the entire set of algorithms.
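The quadratic interpolation operator that distinguishes QI-PSO from standard PSO can be sketched as follows; the formula is the standard three-point quadratic interpolation for a minimum, and the objective function here is a placeholder rather than the motor design objective.

```python
# Minimal sketch: quadratic interpolation (QI) operator as used in QI-type PSO.
# Given three particles, fit a parabola per dimension and take its vertex as a
# candidate solution (placeholder objective, not the motor design problem).
import numpy as np

def sphere(x):                      # placeholder objective function
    return float(np.sum(x ** 2))

def qi_point(a, b, c, f):
    """Vertex of the parabola through (a,f(a)), (b,f(b)), (c,f(c)), per dimension."""
    fa, fb, fc = f(a), f(b), f(c)
    num = (b**2 - c**2) * fa + (c**2 - a**2) * fb + (a**2 - b**2) * fc
    den = (b - c) * fa + (c - a) * fb + (a - b) * fc
    den = np.where(np.abs(den) < 1e-12, 1e-12, den)   # avoid division by zero
    return 0.5 * num / den

rng = np.random.default_rng(1)
a, b, c = (rng.uniform(-5, 5, 3) for _ in range(3))   # three particle positions
x_qi = qi_point(a, b, c, sphere)
print("QI candidate:", x_qi, "objective:", sphere(x_qi))
```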

Keywords: Design, harmonics, induction motor, particle swarm optimization

8434 Mobile Phone as a Tool for Data Collection in Field Research

Authors: Sandro Mourão, Karla Okada

Abstract:

The necessity of accurate and timely field data is shared among organizations engaged in fundamentally different activities, whether public services or commercial operations. Basically, there are three major components in the process of qualitative research: data collection, interpretation and organization of data, and the analytic process. Significant technological advancements have been made in mobile devices (mobile phones, PDAs, tablets, laptops, etc.), resources that can potentially be applied to the data collection activity of field research in order to improve this process. This paper presents and discusses the main features of a mobile-phone-based solution for field data collection, composed of three modules: a survey editor, a server web application, and a client mobile application. The data gathering process begins with the survey creation module, which enables the production of tailored questionnaires. The field workforce receives the questionnaires on their mobile phones, collects the interview responses, and sends them back to a server for immediate analysis.
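The three-module structure described above can be sketched in miniature as follows; the survey schema, endpoint paths, and field names are hypothetical, and Flask merely stands in for the server web application.

```python
# Minimal sketch (hypothetical schema and endpoints): a server that issues a
# tailored questionnaire and receives responses from the mobile client.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Questionnaire produced by the (hypothetical) survey editor module.
SURVEY = {
    "id": "field-survey-001",
    "questions": [
        {"id": "q1", "text": "Household size?", "type": "number"},
        {"id": "q2", "text": "Main water source?", "type": "choice",
         "options": ["piped", "well", "river"]},
    ],
}
RESPONSES = []   # in a real system this would be a database

@app.get("/survey")
def get_survey():
    return jsonify(SURVEY)                 # mobile client downloads the questionnaire

@app.post("/responses")
def post_response():
    RESPONSES.append(request.get_json())   # client uploads interview answers
    return jsonify({"stored": len(RESPONSES)})

if __name__ == "__main__":
    app.run()
```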

Keywords: Data Gathering, Field Research, Mobile Phone, Survey.

8433 Effect of Assumptions of Normal Shock Location on the Design of Supersonic Ejectors for Refrigeration

Authors: Payam Haghparast, Mikhail V. Sorin, Hakim Nesreddine

Abstract:

The complex oblique shock phenomenon can be simply assumed to be a normal shock at the constant-area section to simulate a sharp pressure increase and velocity decrease in 1-D thermodynamic models. The assumed normal shock location is one of the greatest sources of error in ejector thermodynamic models, and most researchers consider an arbitrary location without justifying it. Our study compares the effect of the normal shock location on ejector dimensions in 1-D models. To this aim, two different ejector experimental test benches, a constant area-mixing ejector (CAM) and a constant pressure-mixing ejector (CPM), are considered, with different known geometries, operating conditions, and working fluids (R245fa, R141b). In the first step, in order to evaluate the real values of the efficiencies in the different ejector parts and the critical back pressure, a CFD model was built and validated by experimental data for the two types of ejectors. These reference data are then used as input to the 1D model to calculate the lengths and diameters of the ejectors. Afterwards, the design output geometry calculated by the 1D model is compared directly with the corresponding experimental geometry. It was found that there is good agreement between the ejector dimensions obtained by the 1D model, for both CAM and CPM, and the experimental ejector data. Furthermore, it is shown that the normal shock location affects only the constant-area length, and it is proven that the inlet normal shock assumption results in a more accurate length. Taking into account previous 1D models, the results suggest assuming the normal shock location at the inlet of the constant-area duct when designing supersonic ejectors.
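For context, the textbook normal shock relations that such 1-D models apply at the assumed shock location are recalled below (calorically perfect gas, upstream Mach number M1, specific heat ratio γ); they are given here only as background, not as the authors' complete model.

```latex
% Standard normal shock relations for a calorically perfect gas
M_2^2 = \frac{1 + \frac{\gamma - 1}{2} M_1^2}{\gamma M_1^2 - \frac{\gamma - 1}{2}},
\qquad
\frac{p_2}{p_1} = 1 + \frac{2\gamma}{\gamma + 1}\left(M_1^2 - 1\right),
\qquad
\frac{\rho_2}{\rho_1} = \frac{(\gamma + 1) M_1^2}{(\gamma - 1) M_1^2 + 2}.
```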

Keywords: 1D model, constant area-mixing, constant pressure-mixing, normal shock location, ejector dimensions.

8432 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis

Authors: N. R. N. Idris, S. Baharom

Abstract:

A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels; in this situation, both the IPD and AD should be utilised in order to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the levels of data on the overall meta-analysis estimates based on IPD only, AD only, and the combination of IPD and AD (mixed data, MD), under different study scenarios. The percentage relative bias (PRB), root mean square error (RMSE), and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as it significantly increases the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD does not provide significant differences in terms of accuracy of the estimates. Additionally, combining the IPD and AD has a moderating effect on the bias of the estimates of the treatment effects, as the IPD tends to overestimate the treatment effects, while the AD has the tendency to produce underestimated effect estimates. These results may provide some guidance in deciding whether significant benefit is gained by pooling the two levels of data when conducting a meta-analysis.
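The evaluation criteria mentioned above can be computed as in the short sketch below; the simulated estimates are placeholders for the meta-analysis output and are not the study's data.

```python
# Minimal sketch: percentage relative bias (PRB), RMSE and coverage probability
# for a set of simulated meta-analysis estimates (placeholder numbers).
import numpy as np

true_effect = 0.5
est = np.array([0.48, 0.55, 0.52, 0.47, 0.51])          # simulated estimates
se = np.array([0.05, 0.06, 0.05, 0.07, 0.05])           # their standard errors

prb = 100 * (est.mean() - true_effect) / true_effect     # percentage relative bias
rmse = np.sqrt(np.mean((est - true_effect) ** 2))        # root mean square error

lo, hi = est - 1.96 * se, est + 1.96 * se                # 95% confidence intervals
coverage = np.mean((lo <= true_effect) & (true_effect <= hi))

print(f"PRB = {prb:.2f}%, RMSE = {rmse:.4f}, coverage = {coverage:.2f}")
```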

Keywords: Aggregate data, combined-level data, individual patient data, meta-analysis.

8431 Numerical Simulation of Punching Shear of Flat Plates with Low Reinforcement

Authors: Fatema-Tuz-Zahura, Raquib Ahsan

Abstract:

Punching shear failure is usually the governing failure mode of flat plate structures. Punching failure is brittle in nature, which makes this type of structure more vulnerable. In the present study, a 3D finite element model of a flat plate with a low reinforcement ratio and without any transverse reinforcement has been developed. Punching shear stress and deflection data were obtained on the surface of the flat plate as well as through the thickness of the model from numerical simulations, and the obtained data were compared with the experimental results. The variation of punching stress with respect to deflection obtained from the numerical results is found to be in good agreement with the experimental results; the variation of punching stress is within 5%. The numerical simulation shows an early and gradual onset of nonlinearity, whereas the onset is late and abrupt in the experimental results. The variation of punching stress for different slab thicknesses between the experimental and numerical results is less than 15%. The developed numerical model is useful to complement the available punching test series performed in the past, and the results obtained from it will be helpful for designing retrofitting schemes for flat plates.

Keywords: Flat plate, finite element model, punching shear, reinforcement ratio.

8430 Component Comparison of Polyaluminum Chloride Produced from Various Methods

Authors: Wen Po Cheng, Chia Yun Chung, Ruey Fang Yu, Chao Feng Chen

Abstract:

The main objective of this research was to study the differences in aluminum hydrolytic products between two polyaluminum chloride (PACl) preparation methods: the acidification of freshly formed amorphous Al(OH)3 and the conventional alkalization of an aluminum chloride solution. According to the Ferron test and 27Al NMR analysis of these two PACl preparation procedures, the reaction rate constant (k) values and the Al13 percentage of the acid addition process at high basicity values were both lower than those of the alkaline addition process. The results indicate that the molecular structure and size distribution of the aluminum species produced by the two preparation methods are suspected to be significantly different at high basicity values.

Keywords: Polyaluminum chloride, Al13, amorphous aluminum hydroxide, Ferron test.

8429 Audio Watermarking Using Spectral Modifications

Authors: Jyotsna Singh, Parul Garg, Alok Nath De

Abstract:

In this paper, we present a non-blind technique for adding a watermark to the Fourier spectral components of an audio signal in such a way that the modified amplitude does not exceed the maximum amplitude spread (MAS). This MAS is defined for the individual Discrete Fourier Transform (DFT) coefficients in a particular frame and is derived from the energy spreading function given by Schroeder. Using this technique, one can store double the information within a given frame length, i.e., overlay the watermark on a host of equal length with the least perceptual distortion. The watermark floats uniformly on the DFT components of the original signal, which helps in detecting any intentional manipulation of the watermarked audio. The scheme is also found to be robust to various signal processing attacks such as the presence of multiple watermarks, additive white Gaussian noise (AWGN), and MP3 compression.
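The spectral-modification idea can be illustrated with the toy sketch below; the per-coefficient bound here is a fixed fraction of the coefficient magnitude, standing in for the masking-derived maximum amplitude spread (MAS), which is not reproduced.

```python
# Minimal sketch (simplified): add a pseudo-noise watermark to DFT magnitudes of
# one audio frame, keeping each modification within a per-coefficient bound.
# The bound here is a fixed fraction of the magnitude; the paper instead derives
# it from Schroeder's spreading function (maximum amplitude spread, MAS).
import numpy as np

rng = np.random.default_rng(42)
frame = rng.normal(0, 0.1, 1024)                     # placeholder audio frame
spectrum = np.fft.rfft(frame)

pn = rng.choice([-1.0, 1.0], size=spectrum.shape)    # pseudo-noise sequence (key)
bound = 0.05 * np.abs(spectrum)                      # stand-in for the MAS

watermarked = (np.abs(spectrum) + bound * pn) * np.exp(1j * np.angle(spectrum))
frame_wm = np.fft.irfft(watermarked, n=len(frame))

# Non-blind detection: correlate the magnitude difference with the known key.
diff = np.abs(np.fft.rfft(frame_wm)) - np.abs(spectrum)
print("detection statistic:", float(np.sum(diff * pn)))   # large positive -> present
```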

Keywords: Discrete Fourier Transform, Spreading Function, Watermark, Pseudo Noise Sequence, Spectral Masking Effect

8428 DIVAD: A Dynamic and Interactive Visual Analytical Dashboard for Exploring and Analyzing Transport Data

Authors: Tin Seong Kam, Ketan Barshikar, Shaun Tan

Abstract:

Advances in location-based data collection technologies such as GPS and RFID, together with the rapid reduction of their costs, provide us with a huge and continuously increasing amount of data about the movement of vehicles, people, and goods in urban areas. This explosive growth of geospatially referenced data has far outpaced planners' ability to utilize and transform the data into insightful information, creating an adverse impact on the return on the investment made to collect and manage these data. Addressing this pressing need, we designed and developed DIVAD, a dynamic and interactive visual analytics dashboard that allows city planners to explore and analyze a city's transportation data to gain valuable insights about its traffic flow and transportation requirements. We demonstrate the potential of DIVAD through the use of interactive choropleth and hexagon binning maps to explore and analyze large taxi-transportation data of Singapore for different geographic and time zones.

Keywords: Geographic Information System (GIS), Movement Data, GeoVisual Analytics, Urban Planning.

8427 Gene Expression Data Classification Using Discriminatively Regularized Sparse Subspace Learning

Authors: Chunming Xu

Abstract:

Sparse representation, which can represent high-dimensional data effectively, has been successfully used in computer vision and pattern recognition problems. However, it does not consider the label information of data samples. To overcome this limitation, we develop a novel dimensionality reduction algorithm, namely discriminatively regularized sparse subspace learning (DR-SSL), in this paper. The proposed DR-SSL algorithm can not only make use of sparse representation to model the data, but can also effectively employ the label information to guide the dimensionality reduction procedure. In addition, the presented algorithm can effectively deal with the out-of-sample problem. Experiments on gene-expression data sets show that the proposed algorithm is an effective tool for dimensionality reduction and gene-expression data classification.

Keywords: sparse representation, dimensionality reduction, label information, sparse subspace learning, gene-expression data classification.

8426 Assessing Pre-Service Teachers' Computer Phobia Levels in Terms of Gender and Experience, Turkish Sample

Authors: Ö.F. Ursavas, H. Karal

Abstract:

This study aims to determine the level of pre-service teachers' computer phobia and to test whether computer phobia varies statistically meaningfully according to gender and computer experience. The study was performed on 430 pre-service teachers at the Education Faculty in Rize, Turkey. Data were collected through the Computer Phobia Scale, consisting of the "Personal Knowledge Questionnaire", the "Computer Anxiety Rating Scale", and the "Computer Thought Survey", and were analyzed with statistical procedures such as the t-test and correlation analysis. According to the results of the statistical analyses, pre-service teachers' computer phobia does not vary statistically by gender; although male pre-service teachers have higher computer anxiety scores, they have lower computer thought scores. It was also observed that there is a strong negative relation between computer experience and computer anxiety, and that pre-service teachers who use computers regularly report lower computer anxiety. The obtained results are discussed in terms of the number of computer classes in the Education Faculty curriculum, the hours of computer classes, and the computer availability of student teachers.
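A compact example of the statistical tests referred to above is given below using SciPy; the scores are invented placeholders, not the study's data.

```python
# Minimal sketch (placeholder scores): independent-samples t-test by gender and
# correlation between computer experience and computer anxiety.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
anxiety_male = rng.normal(55, 10, 200)      # placeholder anxiety scores
anxiety_female = rng.normal(53, 10, 230)

t, p = stats.ttest_ind(anxiety_male, anxiety_female)
print(f"t = {t:.2f}, p = {p:.3f}")          # gender difference in anxiety

experience_years = rng.uniform(0, 10, 430)
anxiety = 60 - 2.5 * experience_years + rng.normal(0, 8, 430)
r, p_r = stats.pearsonr(experience_years, anxiety)
print(f"r = {r:.2f}, p = {p_r:.3f}")        # negative experience-anxiety relation
```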

Keywords: Computer phobia, computer anxiety, computer thought, pre-service teachers.

8425 Horizontal Directivity of Pipa Radiation

Authors: Xin Wang, Yuanzhong Wang

Abstract:

The Pipa is one of the most important Chinese traditional plucked instruments, but its directivity has never been measured systematically. In the West, the directivity of loudness for Western instruments has been researched in depth through analysis of sound pressure level, whereas the directivity of timbre is seldom studied. In this paper, a new method for the directivity of timbre is proposed, and horizontal directivity patterns of loudness and timbre of the Pipa are measured. The directivity of Pipa radiation was measured in an anechoic room: the sound of the Pipa played by a musician was recorded simultaneously by 32 microphones with the Pipa in the center, and the measurement results were examined through a listening test. Based on the measured directivity of Pipa radiation, we put forward the best placement of the Pipa in a Chinese traditional orchestra and the optimal recording region.

Keywords: Directivity, Pipa, Roughness, Listening test.

8424 Power Transformer Noise, Noise Tests, and Example Test Results

Authors: E. Doğan, B. Kekezoğlu

Abstract:

The voltage level must be raised in order to deliver the produced energy to consumption zones with less loss and at lower cost. Power transformers, used to raise or lower the voltage, are important parts of the energy transmission system. As cities expand, power transformers used in switchgear and power generation plants increasingly end up in densely inhabited zones. Accordingly, the noise levels produced by power transformers have become more and more important, and they have established themselves as a research field. In this research, the noise produced by transformers is investigated, its causes are examined, and noise measurement techniques are introduced. Examples of transformer noise test results are presented, and precautions to be taken in order to decrease the noise produced by transformers are discussed.

Keywords: Power transformer, noise measurement, core noise, load noise, fan-pump noise.

8423 Determining Cluster Boundaries Using Particle Swarm Optimization

Authors: Anurag Sharma, Christian W. Omlin

Abstract:

The self-organizing map (SOM) is a well-known data reduction technique used in data mining. Data visualization can reveal structure in data sets that is otherwise hard to detect from raw data alone; however, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters of code vectors found by SOMs, but they generally do not take into account the distribution of code vectors; this may lead to unsatisfactory clustering and poor definition of cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of a generic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOMs. The application of our method to unlabeled call data for a mobile phone operator demonstrates its feasibility. The PSO algorithm utilizes the U-matrix of SOMs to determine cluster boundaries; the results of this novel automatic method correspond well to boundary detection through visual inspection of code vectors and to the k-means algorithm.

Keywords: Particle swarm optimization, self-organizing maps, clustering, data mining.

8422 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and the predicted variables, where past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics in order to process and analyze big data; nevertheless, they have been curbed by the limits of classical methods of predictive analysis in the case of large amounts of data. In fact, because of their volume, their nature (semi-structured or unstructured), and their variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of the calculation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and then a version of the extended algorithm is defined in order to make it applicable to huge quantities of data.

Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.

8421 Development of a Smart System for Measuring Strain Levels of Natural Gas and Petroleum Pipelines on Earthquake Fault Lines in Türkiye

Authors: Ahmet Yetik, Seyit Ali Kara, Cevat Özarpa

Abstract:

Load changes occur on natural gas and oil pipelines due to natural disasters. The displacement of the soil around natural gas and oil pipes in situations that may cause erosion, such as earthquakes, landslides, and floods, is the source of this load change. The exposure of natural gas and oil pipes to variable loads causes deformation, cracks, and breaks in these pipes, which can lead to significant damage to people and the environment, including the risk of explosions. Examinations made after natural disasters make it easy to understand which pipes have sustained more damage in the quake-affected regions, and it has been determined that earthquakes in Türkiye have caused permanent damage to pipelines. This project was initiated in response to the identification of cracks and gas leaks in the insulation gaskets placed in the pipelines, especially at the junction points. In this study, a SCADA (Supervisory Control and Data Acquisition) application has been developed to monitor load changes caused by natural disasters. The developed SCADA application monitors the changes along the x, y, and z axes of the stresses occurring in the pipes with the help of strain gauge sensors placed on the pipes. For the developed SCADA system, test setups complying with the relevant standards were created during the fieldwork; these test setups were integrated into the SCADA system, and the system was monitored. Thanks to the SCADA system developed with the field application, the load changes that occur on natural gas and oil pipes are monitored instantly, accumulations that may create a load on the pipes and their surroundings are addressed immediately, and new risks that may arise are prevented. The system has contributed to energy supply security, asset management, holistic pipeline management, and overall sustainability in the industry.
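As background for how strain values reach the SCADA layer, the sketch below converts a quarter-bridge voltage ratio to strain using the common first-order approximation; the gauge factor, excitation voltage, threshold, and readings are assumed values, not the project's configuration.

```python
# Minimal sketch (assumed values): convert a quarter-bridge Wheatstone reading
# to strain using the first-order approximation  Vout/Vex ~= GF * strain / 4.
GAUGE_FACTOR = 2.0          # typical metallic strain gauge factor (assumed)
V_EXCITATION = 5.0          # bridge excitation voltage in volts (assumed)
ALARM_MICROSTRAIN = 500.0   # hypothetical alarm threshold

def microstrain(v_out):
    """Approximate strain (in microstrain) from the bridge output voltage."""
    return 4.0 * (v_out / V_EXCITATION) / GAUGE_FACTOR * 1e6

for v in (0.0002, 0.0008, 0.0015):          # sample readings in volts
    eps = microstrain(v)
    flag = "ALARM" if abs(eps) > ALARM_MICROSTRAIN else "ok"
    print(f"Vout = {v*1e3:.2f} mV -> {eps:7.1f} microstrain [{flag}]")
```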

Keywords: Earthquake, natural gas pipes, oil pipes, voltage measurement, landslide.

8420 Investigating the Dynamics of Knowledge Acquisition in Learning Using Differential Equations

Authors: Gilbert Makanda, Roelf Sypkens

Abstract:

A mathematical model for knowledge acquisition in teaching and learning is proposed. In this study, we adapt the type of mathematical model normally used for disease modelling to teaching and learning, and we derive mathematical conditions that facilitate knowledge acquisition. The study compares the effects of dropping out of the course at early stages with dropping out at later stages of learning, and it also investigates the effect of individual interaction and of learning from other sources in facilitating learning. Actual data are fitted to a general mathematical model using MATLAB ODE45 and lsqnonlin to obtain a unique mathematical model that can be used to predict knowledge acquisition. The data used in this study were obtained from the tutorial test results of Mathematics 2 students at the Central University of Technology, Free State, South Africa, in the Department of Mathematical and Physical Sciences. The study confirms the already known results that increasing dropout rates and forgetting taught concepts reduce the population of knowledgeable students, while increasing teaching contacts and access to other learning materials facilitates knowledge acquisition. The effect of increasing dropout rates is more pronounced in the later stages of learning than in the earlier stages. The study opens up a new direction for further investigations of teaching and learning using differential equations.
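A Python analogue of the MATLAB workflow described above (ODE45 plus lsqnonlin) is sketched below using SciPy; the compartmental model form, parameter names, and test-score data are illustrative assumptions, not the model fitted in the paper.

```python
# Minimal sketch (illustrative model and data): fit an epidemic-style knowledge
# acquisition model to tutorial test data, analogous to ODE45 + lsqnonlin.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def model(t, y, beta, mu, delta):
    """S: not-yet-knowledgeable, K: knowledgeable (hypothetical 2-compartment form)."""
    S, K = y
    dS = -beta * S * K - mu * S              # learning by interaction, dropout
    dK = beta * S * K - delta * K            # forgetting at rate delta
    return [dS, dK]

t_data = np.array([0, 2, 4, 6, 8, 10])                    # weeks (placeholder)
k_data = np.array([0.05, 0.15, 0.35, 0.55, 0.65, 0.70])   # placeholder proportions

def residuals(params):
    beta, mu, delta = params
    sol = solve_ivp(model, (0, 10), [0.95, 0.05], t_eval=t_data,
                    args=(beta, mu, delta))
    return sol.y[1] - k_data

fit = least_squares(residuals, x0=[0.5, 0.01, 0.05], bounds=(0, 5))
print("fitted (beta, mu, delta):", fit.x)
```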

Keywords: Differential equations, knowledge acquisition, least squares nonlinear, dynamical systems.

8419 A Business-to-Business Collaboration System That Promotes Data Utilization While Encrypting Information on the Blockchain

Authors: Hiroaki Nasu, Ryota Miyamoto, Yuta Kodera, Yasuyuki Nogami

Abstract:

To promote Industry 4.0, Society 5.0, and similar initiatives, it is important to connect and share data so that every member can trust them. Blockchain (BC) technology is currently attracting attention as the most advanced tool and has been used in the financial field, among others. However, data collaboration using BC has not progressed sufficiently among companies in the manufacturing supply chain that handle sensitive data such as product quality and manufacturing conditions. There are two main reasons why data utilization is not sufficiently advanced in the industrial supply chain. The first is that manufacturing information is top secret and a source of profit for companies, so it is difficult to disclose data even between companies with transactions in the supply chain; blockchain mechanisms such as Bitcoin, which use a Public Key Infrastructure (PKI), require plaintext to be shared between companies in order to verify the identity of the company that sent the data. The second is that the merits (scenarios) of data collaboration between companies are not specifically identified in the industrial supply chain. To address these problems, this paper proposes a Business-to-Business (B2B) collaboration system using homomorphic encryption and BC techniques. Using the proposed system, each company in the supply chain can exchange confidential information as encrypted data and utilize the data for its own business. In addition, this paper considers a scenario focusing on quality data, which have been difficult to share because they are top secret. In this scenario, we show an implementation scheme and the benefit of concrete data collaboration by proposing a comparison protocol that can capture changes in quality while hiding the numerical values of the quality data.
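The comparison-on-encrypted-data idea can be illustrated with an additively homomorphic scheme; the sketch below uses the python-paillier (phe) package and reveals only the change in quality, while the paper's concrete protocol, thresholds, and blockchain integration are not reproduced.

```python
# Minimal sketch (illustrative, not the paper's protocol): a supplier encrypts two
# quality measurements; a partner computes the encrypted difference, and only that
# difference is decrypted, so the change in quality is learned without revealing
# the underlying values. Uses the python-paillier (phe) additively homomorphic scheme.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Supplier side: encrypt last month's and this month's quality scores.
q_last = public_key.encrypt(93.2)
q_now = public_key.encrypt(91.7)

# Partner side: homomorphically compute the change without seeing the values
# (scalar multiplication by -1 plus ciphertext addition).
encrypted_change = q_now + (q_last * -1)

# Only the change is decrypted by the key holder and shared.
change = private_key.decrypt(encrypted_change)
print("quality change:", round(change, 2),
      "-> degraded" if change < 0 else "-> improved")
```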

Keywords: Business to business data collaboration, industrial supply chain, blockchain, homomorphic encryption.

8418 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

This research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were used as the data in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.

Keywords: Daily rainfall, Image processing, Approximation, Pixel value data.

8417 Automatic Generation of Ontology from Data Source Directed by Meta Models

Authors: Widad Jakjoud, Mohamed Bahaj, Jamal Bakkas

Abstract:

In this paper, we present a method for the automatic generation of an ontological model from any data source using Model Driven Architecture (MDA); this generation is dedicated to the cooperation of knowledge engineering and software engineering. Indeed, reverse engineering of a data source generates a software model (a data schema) that then undergoes transformations to generate the ontological model. The method uses meta-models to validate the software and ontological models.

Keywords: Meta model, model, ontology, data source.

8416 Steps towards the Development of National Health Data Standards in Developing Countries: An Exploratory Qualitative Study in Saudi Arabia

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian R. Murray

Abstract:

The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and leading to increasing proprietary interests. The government's role and support in the standardization of health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps that a government needs to undertake towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues affecting the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of experts in the areas of data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; even more important is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners to plan health data standards, in particular in developing countries.

Keywords: Interoperability, Case Study, Health Data Standards, Medical Data Exchange, Saudi Arabia.

8415 Cascade Kalman Filter Configuration for Low Cost IMU/GPS Integration in Car Navigation Like Robot

Authors: Othman Maklouf, Abdurazag Ghila, Ahmed Abdulla

Abstract:

This paper introduces a low-cost INS/GPS algorithm for land vehicle navigation applications. The data fusion process is done with an extended Kalman filter in a cascade configuration, and a loosely coupled configuration is considered. In order to perform numerical simulations, software was developed in MATLAB. The results obtained in this work demonstrate that a low-cost INS/GPS navigation system is partially capable of meeting the performance requirements for land vehicle navigation. The relative effectiveness of the Kalman filter implementation in the integrated GPS/INS navigation algorithm is highlighted. The paper also provides experimental results from a field test carried out using a car.
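A miniature loosely coupled fusion step in the spirit of the above can be sketched as follows; the state, noise values, and measurement model are simplified placeholders (1-D position/velocity with GPS position updates), not the paper's full cascaded extended Kalman filter.

```python
# Minimal sketch (simplified 1-D placeholder, not the full cascaded EKF):
# predict position/velocity with IMU acceleration, correct with GPS position.
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])          # state transition (position, velocity)
B = np.array([[0.5 * dt**2], [dt]])      # control input (acceleration from IMU)
H = np.array([[1.0, 0.0]])               # GPS measures position only
Q = 0.01 * np.eye(2)                     # process noise (assumed)
R = np.array([[4.0]])                    # GPS measurement noise (assumed, m^2)

x = np.zeros((2, 1))                     # initial state
P = np.eye(2)

def kf_step(x, P, accel, gps_pos):
    # Prediction with the IMU acceleration as control input.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Correction with the GPS position measurement.
    y = np.array([[gps_pos]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(5):
    x, P = kf_step(x, P, accel=0.2, gps_pos=0.5 * 0.2 * (k * dt) ** 2)
print("estimated [position, velocity]:", x.ravel())
```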

Keywords: GPS, INS, IMU, Kalman filter.

8414 CSR of Top Portuguese Companies: Relation between Social Performance and Economic Performance

Authors: Afonso, S. C., Fernandes, P. O., Monte, A. P.

Abstract:

Modern times call on organizations to play an active role in the social arena through Corporate Social Responsibility (CSR). The objective of this research was to test the hypothesis that there is a positive relation between social performance and economic performance, and whether there is a positive correlation between social performance and financial-economic performance. To test these theories, a measure of social performance based on the Green Book of the Commission of the European Community was applied to a group of nineteen top Portuguese companies listed on the PSI 20 index over a period of five years, from 2005 to 2009. A cluster analysis was applied to group companies by their social performance and to compare and correlate their economic performance. The results indicate that the companies with better social performance are not the ones with better economic performance, and suggest that a middle path might provide a good CSR-economic performance relation, as a basis for sustainable development.

Keywords: Corporate Social Responsibility, Economic Performance, Win-Win relationship

8413 Morphemic Analysis Awareness: A Boon or Bane on ESL Students’ Vocabulary Learning Strategy

Authors: Chandrakala Varatharajoo, Adelina Binti Asmawi, Nabeel Abdallah Mohammad Abedalaziz

Abstract:

This study investigated the impact of inflectional and derivational morphemic analysis awareness on ESL secondary school students' vocabulary learning strategy. The quasi-experimental study was conducted with 106 low-proficiency secondary school students in two experimental groups (inflectional and derivational) and one control group. The students' vocabulary acquisition was assessed through two measures, a Morphemic Analysis Test and a Vocabulary-Morphemic Test, administered as pretest and posttest before and after an intervention programme. Results of ANCOVA revealed that both experimental groups achieved significant gains on the Morphemic Analysis Test and the Vocabulary-Morphemic Test; however, the inflectional group obtained a somewhat higher score than the derivational group. Thus, the results indicate that low-proficiency ESL secondary school students performed better with inflectional morphemic awareness than with derivational morphemic awareness, and that awareness of inflectional morphology contributed more to vocabulary acquisition. Importantly, learning inflectional morphology can help low-proficiency ESL secondary school students develop both morphemic awareness and vocabulary gains. Theoretically, these findings show that not all morphemes are equally useful to students for their language development. Practically, they indicate that morphological instruction should at least be included in remediation and instructional efforts with struggling learners across all grade levels, allowing them to focus on meaning within the word before they attempt the text at large for better comprehension. Methodologically, by conducting individualized intervention and assessment, this study provides fresh empirical evidence to support the existing literature on morphemic analysis awareness and vocabulary learning strategy. Thus, a major pedagogical implication of the study is that the morphemic analysis awareness strategy is a definite boon for ESL secondary school students in learning English vocabulary.

Keywords: ESL, instruction, morphemic analysis, vocabulary.

8412 Optical Switching Based on Bragg Solitons in a Nonuniform Fiber Bragg Grating

Authors: Abdulatif Abdusalam, Mohamed Shaban

Abstract:

In this paper, we consider nonlinear pulse propagation through a nonuniform birefringent fiber Bragg grating (FBG) whose index modulation depth varies along the propagation direction. Here, the pulse propagation is governed by the nonlinear birefringent coupled mode (NLBCM) equations. To form the Bragg soliton outside the photonic bandgap (PBG), the NLBCM equations are reduced to the well-known nonlinear Schrödinger (NLS) type equation by multiple-scale analysis. Since we consider pulse propagation in a nonuniform FBG, the propagation outside the PBG is governed by an inhomogeneous NLS (INLS) equation rather than the NLS equation. We then discuss the formation of a soliton in the FBG, known as a Bragg soliton, whose central frequency lies outside but close to the PBG of the grating structure. Further, we discuss Bragg soliton compression due to a delicate balance between self-phase modulation (SPM) and the varying grating-induced dispersion. In addition, Bragg soliton collisions, Bragg soliton switching, and possible logic gates are also discussed.
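For reference, the standard NLS-type envelope equation to which such coupled-mode equations reduce outside the photonic bandgap has the generic form below; the coefficients β2 (grating-induced dispersion) and γ (effective nonlinearity) become z-dependent in the inhomogeneous (INLS) case of a nonuniform grating, and the exact reduction derived in the paper is not reproduced here.

```latex
% Generic NLS-type envelope equation (inhomogeneous when \beta_2, \gamma depend on z)
i\,\frac{\partial u}{\partial z}
  - \frac{\beta_2(z)}{2}\,\frac{\partial^2 u}{\partial t^2}
  + \gamma(z)\,|u|^2 u = 0 .
```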

Keywords: Bragg grating, Nonuniform fiber, Nonlinear pulse.
