Search results for: Runge Kutta methods.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4041

3741 Robust ANOVA: An Illustrative Study in Horticultural Crop Research

Authors: Dinesh Inamadar, R. Venugopalan, K. Padmini

Abstract:

An attempt has been made in the present communication to elucidate the efficacy of robust ANOVA methods for analysing horticultural field experimental data in the presence of outliers. The results support the use of robust ANOVA methods, as there was a substantial reduction in the error mean square, and hence in the probability of committing a Type I error, compared with the regular approach.
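
As a hedged illustration of the outlier screening implied by the keywords (Cook's distance), the following minimal Python sketch flags influential observations in a one-way layout before an ANOVA; the yield values, treatment coding and the 4/n cutoff are illustrative assumptions, not data from the paper.

```python
import numpy as np
import statsmodels.api as sm

# hypothetical yields for two treatments; one observation is a gross outlier
yield_t1 = [12.1, 11.8, 12.4, 12.0, 11.9]
yield_t2 = [14.2, 13.9, 30.0, 14.1, 13.8]          # 30.0 is the suspect value
y = np.array(yield_t1 + yield_t2)
treatment = np.array([0] * 5 + [1] * 5)

X = sm.add_constant(treatment)                     # one-way ANOVA expressed as a linear model
fit = sm.OLS(y, X).fit()

cooks_d, _ = fit.get_influence().cooks_distance    # Cook's distance for each observation
print(np.where(cooks_d > 4 / len(y))[0])           # flag points above the common 4/n rule of thumb
```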

Keywords: Outliers, robust ANOVA, horticulture, Cook's distance, Type I error.

3740 The Use of Process-Oriented Methods of Calculation to Determine the Costs of Logistics Processes

Authors: Tomas Cechura, Michal Simon

Abstract:

The aim of this paper is to propose a way of determining the costs of logistics processes using process-oriented calculation methods. In the traditional approach, logistics costs are part of manufacturing overhead and are usually calculated as a percentage surcharge, so it is not obvious where and in which activities costs were incurred, and logistics costs cannot be traced to products. Our approach tries to fix, or at least improve, this issue. A further benefit of applying the process approach is the identification of logistics processes that would otherwise remain hidden in manufacturing overhead. The first part of this paper describes the development of process-oriented methods over time. The next part shows how the process-oriented method called Prozesskostenrechnung can be applied to logistics processes. The conclusion summarizes the advantages and disadvantages of using this method in logistics.

Keywords: Cost, logistics, calculation, process-oriented method.

3739 Construction Methods for Sign Patterns Allowing Nilpotence of Index k

Authors: Jun Luo

Abstract:

For a square sign pattern B, the smallest integer k such that B^k = 0 is called the index (of nilpotence) of B. In this paper, we study sign patterns allowing nilpotence of index k and obtain four methods to construct sign patterns allowing nilpotence of index at most k, which generalizes some recent results.

Keywords: Sign pattern, Nilpotence, Jordan block.

3738 Introduction of Self-Healing Concrete and Different Methods of Its Scientific Implementation

Authors: Davoud Beheshtizadeh, Davood Jafari

Abstract:

Concrete, with its unique properties and advantages, has gained widespread and increasing use in the construction industry, particularly in a country's infrastructure. However, concrete exhibits certain defects, most notably the presence of micro-cracks that occur after the setting process, leading to increased costs for infrastructure repair and maintenance. As a result, self-healing concretes have garnered attention in various countries in recent years. These concretes employ different mechanisms for repair, including physical, chemical, biological, and combined approaches, each with its own subsets and implementation methods. Certain mechanisms hold significant importance, leading to specialized production methods. Given the novelty of this subject in Iran, there is limited knowledge or, in some cases, a complete lack of understanding. This paper presents various self-healing concrete mechanisms and the advantages, disadvantages, and application scope of each method.

Keywords: Micro-cracks, self-healing concrete, microcapsules, concrete, cement, self-sensitive.

3737 A Hybrid Feature Selection by Resampling, Chi squared and Consistency Evaluation Techniques

Authors: Amir-Massoud Bidgoli, Mehdi Naseri Parsa

Abstract:

In this paper, a combined feature selection method is proposed which takes advantage of sample domain filtering, resampling, and feature subset evaluation to reduce the dimensions of huge datasets and select reliable features. The method utilizes both the feature space and the sample domain to improve the feature selection process and uses a combination of Chi-squared and Consistency attribute evaluation to seek reliable features. It consists of two phases: the first phase filters and resamples the sample domain, and the second phase adopts a hybrid procedure to find the optimal feature space by applying Chi-squared, Consistency subset evaluation, and genetic search. Experiments on various-sized datasets from the UCI Repository of Machine Learning databases show that the performance of five classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best First Decision Tree and JRIP) improves simultaneously and that the classification error for these classifiers decreases considerably. The experiments also show that this method outperforms other feature selection methods.
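
The two-phase procedure lends itself to a compact illustration. The following Python sketch, using scikit-learn, resamples the sample domain and then ranks features with a Chi-squared score; the dataset, sample size and k are illustrative assumptions, and the Consistency subset evaluation and genetic search of the paper are not reproduced here.

```python
from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.utils import resample

X, y = load_digits(return_X_y=True)

# phase 1 (sketch): resample the sample domain to a smaller, stratified subset
X_res, y_res = resample(X, y, n_samples=1000, stratify=y, random_state=0)

# phase 2 (sketch): rank features by the Chi-squared score and keep the top k
selector = SelectKBest(chi2, k=20).fit(X_res, y_res)
X_sel = selector.transform(X_res)

# one of the five classifiers mentioned in the abstract, evaluated on the reduced space
print(cross_val_score(GaussianNB(), X_sel, y_res, cv=5).mean())
```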

Keywords: feature selection, resampling, reliable features, Consistency Subset Evaluation.

3736 Panoramic Sensor Based Blind Spot Accident Prevention System

Authors: Rajendra Prasad Mahapatra, K. Vimal Kumar

Abstract:

There are many automotive accidents due to blind spots and driver inattentiveness. The blind spot is the area that is invisible from the driver's viewpoint without head rotation. Several methods are available for assisting drivers. The simplest are rear-view mirrors and wide-angle lenses, but these require human attention, so their accuracy depends on the driver. Another, automated, approach makes use of sensors such as sonar or radar to gather range information, which is then processed to detect a potential collision. The disadvantage of such systems is low angular resolution and limited sensing volume. This paper presents a panoramic sensor based automotive vehicle monitoring system.

Keywords: Panoramic sensors, Blind spot, Convex lens, Computer Vision, Sonar.

3735 Shape Error Concealment for Shape Independent Transform Coding

Authors: Sandra Ondrušová, Jaroslav Polec

Abstract:

Arbitrarily shaped video objects are an important concept in modern video coding methods. The techniques presently used are based not on image elements but on video objects having an arbitrary shape. In this paper, spatial shape error concealment techniques for object-based images in error-prone environments are proposed. We consider a geometric shape representation consisting of the object boundary, which can be extracted from the α-plane. Three different approaches are used to replace a missing boundary segment: Bézier interpolation, Bézier approximation and NURBS approximation. Experimental results on object shapes of different concealment difficulty demonstrate the performance of the proposed methods. Comparisons between the proposed methods are also presented.
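
As a hedged sketch of one of the three approaches (Bézier interpolation across a missing boundary segment), the following Python snippet evaluates a cubic Bézier curve between the boundary points adjacent to a gap; the control-point choice and the coordinates are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=50):
    """Evaluate a cubic Bézier curve (Bernstein form) from four control points."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# hypothetical boundary points on each side of a missing segment
left = np.array([[10.0, 12.0], [14.0, 15.0]])    # last two known points before the gap
right = np.array([[26.0, 14.0], [30.0, 11.0]])   # first two known points after the gap

# simple control-point choice: extrapolate the boundary tangents into the gap
p0, p3 = left[1], right[0]
p1 = left[1] + (left[1] - left[0])
p2 = right[0] + (right[0] - right[1])

segment = cubic_bezier(p0, p1, p2, p3)           # candidate replacement for the lost boundary
print(segment[:3])
```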

Keywords: error concealment, shape coding, object-based image, NURBS, Bézier curves.

3734 Resource Efficiency within Current Production

Authors: Sarah Majid Ansari, Serjosha Wulf, Matthias Görke

Abstract:

In times of global warming and the increasing shortage of resources, sustainable production is becoming more and more indispensable. Companies can not only heighten their competitiveness but also contribute positively to environmental protection through efficient energy and resource consumption. In this regard, technical solutions are often preferred during production, although organizational and process-related approaches also offer great potential. This project focuses on reducing resource usage, with a special emphasis on the human factor. The aspiration is to develop a methodology that systematically implements and embeds suitable, individual measures and methods for resource efficiency throughout the entire production. The measures and methods established help employees handle resources and energy more sensitively. With this in mind, this paper also deals with the difficulties that can occur during the sensitization of employees and the implementation of these measures and methods. In addition, recommendations are given on how to avoid such difficulties.

Keywords: Implementation, human factor, production plant, resource efficiency.

3733 Wasting Human and Computer Resources

Authors: Mária Csernoch, Piroska Biró

Abstract:

The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement. (1) The lack of a definition of correctly edited, formatted documents. Consequently, end-users do not know whether their methods and results are correct or not; they are not aware of their ignorance, and their ignorance does not allow them to realize their lack of knowledge. (2) The end-users’ problem-solving methods. We have found that in non-traditional programming environments end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which have been proved less effective than deep-approach methods. Based on these findings we have developed deep-approach methods which are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical documents, text-based documents. We provide a definition of correctly edited text and, based on this definition, adapt the debugging method known from programming. According to the method, before real text editing takes place, a thorough debugging of already existing texts and a categorization of errors are carried out. In this way, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. Its advantages are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.

Keywords: Deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources.

3732 En-Face Optical Coherence Tomography and Fluorescence in Evaluation of Orthodontic Interfaces

Authors: R. O. Rominu, C. Sinescu, D.M. Pop, M. Hughes, A. Bradu, M. Rominu, A. Gh. Podoleanu

Abstract:

Bonding has become a routine procedure in several dental specialties – from prosthodontics to conservative dentistry and even orthodontics. In many of these fields it is important to be able to investigate the bonded interfaces to assess their quality. All currently employed investigative methods are invasive, meaning that samples are destroyed in the testing procedure and cannot be used again. We have investigated the interface between human enamel and bonded ceramic brackets non-invasively, introducing a combination of new investigative methods – optical coherence tomography (OCT), fluorescence OCT and confocal microscopy (CM). Brackets were conventionally bonded on conditioned buccal surfaces of teeth. The bonding was assessed using these methods. Three dimensional reconstructions of the detected material defects were developed using manual and semi-automatic segmentation. The results clearly prove that OCT, fluorescence OCT and CM are useful in orthodontic bonding investigations.

Keywords: Optical coherence tomography, Confocal Microscopy, Orthodontic Bonding.

3731 Adopted Method of Information System Strategy for Knowledge Management System: A Literature Review

Authors: Elin Cahyaningsih, Dana Indra Sensuse, Wahyu Catur Wibowo, Sofiyanti Indriasari

Abstract:

The bureaucracy reform program is driving the Indonesian government to change its management in order to enhance organizational performance. Information technology has become one of the strategic areas that organizations try to improve. A knowledge management system is an information system that supports the implementation of knowledge management in government; it is categorized under the people perspective because it depends heavily on human interaction and participation. The strategic plan for developing a knowledge management system can be determined using information system strategy methods. This research was conducted to identify the types of information system strategy methods, the stages of activity in each method, and their strengths and weaknesses. A literature review was used to identify and classify the strategy methods, differentiate method types, and categorize common activities, strengths and weaknesses. As a result, six information system strategy methods are identified and compared; Balanced Scorecard and Risk Analysis are found to be the most commonly used strategic methods and to have the greatest strengths.

Keywords: Knowledge management system, balanced scorecard, five force, risk analysis, gap analysis, value chain analysis, SWOT analysis.

3730 New Analysis Methods on Strict Avalanche Criterion of S-Boxes

Authors: Phyu Phyu Mar, Khin Maung Latt

Abstract:

S-boxes (substitution boxes) are keystones of modern symmetric cryptosystems (block ciphers as well as stream ciphers). S-boxes bring nonlinearity to cryptosystems and strengthen their cryptographic security; they are used to provide confusion in data security. An S-box satisfies the strict avalanche criterion (SAC) if and only if, for any single input bit of the S-box, inverting it changes each output bit with probability one half. If a function (cryptographic transformation) is complete, then each output bit depends on all of the input bits. Thus, if it were possible to find the simplest Boolean expression for each output bit in terms of the input bits, each of these expressions would have to contain all of the input bits if the function is complete. Among the important properties of S-boxes, the most interesting property, the SAC (Strict Avalanche Criterion), is presented, and three analysis methods are proposed to analyze this property.
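
The SAC definition quoted above translates directly into a testable condition. The following Python sketch estimates, for every (input bit, output bit) pair, the probability that inverting the input bit flips the output bit; the 3-bit S-box used here is an arbitrary toy permutation, and the three analysis methods proposed in the paper are not reproduced.

```python
import numpy as np

def sac_matrix(sbox, n_in, n_out):
    """Probability that inverting input bit i flips output bit j, for all pairs (i, j)."""
    counts = np.zeros((n_in, n_out))
    for x in range(2 ** n_in):
        y = sbox[x]
        for i in range(n_in):
            diff = y ^ sbox[x ^ (1 << i)]            # outputs before/after inverting input bit i
            for j in range(n_out):
                counts[i, j] += (diff >> j) & 1      # did output bit j change?
    return counts / 2 ** n_in

# toy 3-bit S-box (an arbitrary permutation, not a component of any real cipher)
sbox = [3, 6, 1, 4, 7, 0, 5, 2]
print(sac_matrix(sbox, 3, 3))                        # SAC holds when every entry is close to 0.5
```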

Keywords: S-boxes, cryptosystems, strict avalanche criterion, function, analysis methods.

3729 A Study of Quality Assurance and Unit Verification Methods in Safety Critical Environment

Authors: Miklos Taliga

Abstract:

In the present case study, we examined the development and testing methods of systems that contain safety-critical elements in different industrial fields. Accordingly, we observed the classical object-oriented development and testing environment, as both the medical technology and automotive industries approach the development of safety-critical elements in that way. Subsequently, we examined model-based development. We introduce the quality parameters that define development and testing. Taking modern agile methodology (Scrum) into consideration, we examined whether, and to what extent, the methodologies we found fit into this environment.

Keywords: Safety-critical elements, quality management, unit verification, model base testing, agile methods, scrum, metamodel, object-oriented programming, field specific modelling, sprint, user story, UML Standard.

3728 Multidimensional Performance Tracking

Authors: C. Ardil

Abstract:

In this study, a model, together with a software tool that implements it, has been developed to determine the performance ratings of employees in an organization operating in the information technology sector, using indicators obtained from the employees' online study data. The Weighted Sum (WS) method and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), both based on a multidimensional decision-making approach, were used in the study. WS and TOPSIS are multidimensional decision-making (MDDM) methods that allow all dimensions to be evaluated together with specific weights, so that the online performance tracking problem can be evaluated objectively. The application of the WS and TOPSIS mathematical methods, which can combine alternatives with a large number of dimensions and reach a solution simultaneously, has been implemented through online performance tracking software. In applying the WS and TOPSIS methods, objective dimension weights were calculated using the entropy information (EI) and standard deviation (SD) methods from the data obtained by online performance tracking, a decision matrix was formed using the performance scores for each employee, and a single performance score was calculated for each employee. Based on the calculated performance score, each employee was given a performance evaluation decision. The results of the Pareto set evidence and a comparative mathematical analysis validate that the employees' performance preference rankings under the WS and TOPSIS methods are closely related. This suggests the compatibility, applicability, and validity of the proposed method for MDDM problems in which a large number of alternatives and dimensions are taken into account. With this study, an objective, realistic, feasible and understandable mathematical method, together with a software tool that implements it, has been demonstrated. This is considered preferable because of the subjectivity, limitations and high cost of the methods traditionally used for measurement and performance appraisal in the information technology sector.
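
As a hedged sketch of the computational core described above (entropy-information weights followed by TOPSIS over a decision matrix), the following Python snippet ranks a few hypothetical employees; the dimensions, the scores and the assumption that all dimensions are benefit criteria are illustrative, not taken from the study.

```python
import numpy as np

# hypothetical decision matrix: rows = employees, columns = performance dimensions
X = np.array([[8.0, 200.0, 0.92],
              [6.5, 260.0, 0.88],
              [7.2, 180.0, 0.95],
              [9.1, 150.0, 0.90]])

# entropy-information weights: dimensions with more dispersion receive more weight
P = X / X.sum(axis=0)
e = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
w = (1 - e) / (1 - e).sum()

# TOPSIS on the weighted, vector-normalized matrix
R = X / np.sqrt((X ** 2).sum(axis=0))
V = R * w
d_best = np.sqrt(((V - V.max(axis=0)) ** 2).sum(axis=1))   # distance to the ideal solution
d_worst = np.sqrt(((V - V.min(axis=0)) ** 2).sum(axis=1))  # distance to the anti-ideal solution
score = d_worst / (d_best + d_worst)
print(score.argsort()[::-1])                               # employee ranking, best first
```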

Keywords: Weighted sum, entropy information, standard deviation, online performance tracking, performance evaluation, performance management, multidimensional decision making.

3727 Control Analysis Using Tuning Methods for a Designed, Developed and Modeled Cross Flow Water Tube Heat Exchanger

Authors: Shaival H. Nagarsheth, Utpal Pandya, Hemant J. Nagarsheth

Abstract:

A cross flow water tube heat exchanger can be designed and made operational using model building and simulation of the system. This paper presents the design and development of a model of a cross flow water tube heat-exchanger system, together with simulation and validation of the control analysis for different tuning methods. A feedback and override control system is developed using inputs acquired with the help of a sensory system. A mathematical model is formulated for analysis of the system behaviour. The temperature is regulated at the desired set point automatically.

Keywords: Heat Exchanger, Feedback, Override, Temperature, PID.

3726 Development of Workplace Environmental Monitoring Systems Using Ubiquitous Sensor Network

Authors: Jung-Min Yun, Jong-Hyun Baek, Byoung Ky Kang, Peom Park

Abstract:

In this study, workplace environmental monitoring systems were established using USN (Ubiquitous Sensor Networks) and LabVIEW. Although existing direct sampling methods give accurate values at the time points of measurement, they make continuous management and supervision difficult and are costly, so the efficiency and reliability of workplace management by supervisors are relatively low when those methods are used. In this study, systems were established so that information on workplace environmental factors such as temperature, humidity and noise is measured and transmitted to a PC in real time, enabling supervisors to monitor workplaces through LabVIEW on the PC. When an accident occurs in a workplace, supervisors can respond immediately through the monitoring system, which enables integrated workplace management and the prevention of safety accidents. By introducing these monitoring systems, safety accidents due to harmful environmental factors in workplaces can be prevented, and the systems will also be helpful in finding correlations between safety accidents and occupational diseases by comparing and linking the databases established by this monitoring system with existing statistical data.

Keywords: Ubiquitous Sensor Nework, LabVIEW, Environment Monitoring.

3725 Performance Evaluation of ROI Extraction Models from Stationary Images

Authors: K.V. Sridhar, Varun Gunnala, K.S.R Krishna Prasad

Abstract:

In this paper, three basic approaches, and different methods under each of them, for extracting a region of interest (ROI) from stationary images are explored. The results obtained for each of the proposed methods are shown, and it is demonstrated where each method outperforms the others. Two main problems in ROI extraction, the channel selection problem and the saliency reversal problem, are discussed, along with how well they are addressed by the various methods. The basic approaches are: 1) a saliency-based approach, 2) a wavelet-based approach, and 3) a clustering-based approach. The saliency approach performs well on images containing objects of high saturation and brightness. The wavelet-based approach performs well on natural scene images that contain regions of distinct textures. The mean shift clustering approach partitions the image into regions according to the density distribution of pixel intensities. The experimental results of the various methodologies show that each technique performs at a different acceptable level for various types of images.
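
As a hedged illustration of the clustering-based approach (mean shift over pixel intensities), the following Python sketch partitions a synthetic grayscale image and keeps the brighter cluster as the region of interest; the image, the bandwidth and the ROI rule are illustrative assumptions, not the paper's models.

```python
import numpy as np
from sklearn.cluster import MeanShift

# synthetic grayscale image: a bright square on a darker, slightly noisy background
rng = np.random.default_rng(0)
img = 0.2 + 0.02 * rng.standard_normal((60, 60))
img[20:40, 20:40] = 0.9 + 0.02 * rng.standard_normal((20, 20))

# mean shift partitions the pixels according to the density of their intensities
pixels = img.reshape(-1, 1)
labels = MeanShift(bandwidth=0.25).fit_predict(pixels).reshape(img.shape)

# keep the brighter cluster as the region of interest (a deliberately simple rule)
roi_label = max(set(labels.ravel()), key=lambda l: img[labels == l].mean())
print(int((labels == roi_label).sum()), "ROI pixels")
```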

Keywords: clustering, ROI, saliency, wavelets.

3724 Ultrasonic Echo Image Adaptive Watermarking Using the Just-Noticeable Difference Estimation

Authors: Amnach Khawne, Kazuhiko Hamamoto, Orachat Chitsobhuk

Abstract:

Many image watermarking methods using properties of the human visual system (HVS) have been proposed in the literature. The visual threshold component is usually related to either the spatial contrast sensitivity function (CSF) or visual masking. In particular, with respect to contrast masking, most methods do not consider the effect near edge regions, even though the HVS is sensitive to what happens in edge areas. This paper proposes ultrasound image watermarking using a visual threshold corresponding to the HVS, in which the coefficients in a DCT block are classified into texture, edge, and plain areas. This classification is not only useful for imperceptibility when the watermark is inserted into an image but also achieves robust watermark detection. A comparison of the proposed method with other methods shows that the proposed method is robust to blockwise memoryless manipulations and also robust against noise addition.

Keywords: Medical image watermarking, Human Visual System, Image Adaptive Watermark

3723 Evaluation of Iranian Standard for Assessment of Liquefaction Potential of Cohesionless Soils Based on Standard Penetration Test

Authors: Reza Ziaie Moayad, Azam Kouhpeyma

Abstract:

In-situ testing is preferred for evaluating the liquefaction potential of cohesionless soils because of the high disturbance caused during sampling. Although new in-situ methods with high accuracy have been developed, the standard penetration test, the simplest and oldest in-situ test, is still used due to the profusion of recorded data. This paper reviews the Iranian standard for evaluating liquefaction potential in soils (code 525) and compares the liquefaction assessment methods based on standard penetration test (SPT) results for cohesionless soil in this standard with international standards. To this end, methods for assessing liquefaction potential are compared with what is presented in standard 525. It is found that although the procedure used in the Iranian standard for evaluating liquefaction potential has not been updated according to new findings, it is a conservative procedure.

Keywords: cohesionless soil, liquefaction, SPT, Iranian liquefaction standard

3722 A Comparison between Russian and Western Approach for Deep Foundation Design

Authors: Saeed Delara, Kendra MacKay

Abstract:

Varying methodologies are considered for pile design in both the Russian and Western approaches. Although both approaches rely on toe and side frictional resistances, different calculation methods are proposed to estimate pile capacity. The Western approach relies on the compactness (internal friction angle) of the soil for cohesionless soils and on undrained shear strength for cohesive soils. The Russian approach relies on grain size for cohesionless soils and on the liquidity index for cohesive soils. Though the methods recommended in the Western approach for predicting pile settlement are relatively simple, the Russian approach provides a detailed method to estimate single pile and pile group settlement. The details of calculating pile axial capacity and settlement using the Russian and Western approaches are discussed and compared against field test results.

Keywords: Pile capacity, pile settlement, Russian approach, western approach.

3721 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks

Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar

Abstract:

DNA barcodes provide a good source of the information needed to classify living species. The classification problem has to be supported with reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be compared simultaneously using Multiple Sequence Alignment, which is known to be NP-complete; however, all the methods used are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable. In fact, our method makes it possible to avoid the complex problem of form and structure in different classes of organisms. The empirical data and their classification performances are compared with other methods. In this study, we present our system, which consists of three phases. The first phase, called transformation, is composed of three sub-steps: Electron-Ion Interaction Pseudopotential (EIIP) coding of the DNA barcodes, Fourier transform, and power spectrum signal processing. The second phase is an approximation step, which is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). Finally, the third phase, the classification of the DNA barcodes, is realized by applying a hierarchical classification algorithm.
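
The transformation phase (EIIP coding, Fourier transform, power spectrum) can be sketched compactly. The following Python snippet converts a toy barcode fragment into a normalized power-spectrum feature vector; the sequence is illustrative, the EIIP values are those commonly cited in the literature, and the MLWNN approximation and hierarchical classification phases are not reproduced.

```python
import numpy as np

# EIIP values assign each nucleotide a pseudopotential (commonly cited values)
EIIP = {'A': 0.1260, 'C': 0.1340, 'G': 0.0806, 'T': 0.1335}

def power_spectrum(barcode):
    """EIIP coding of a DNA barcode, followed by FFT and a normalized power spectrum."""
    signal = np.array([EIIP[b] for b in barcode if b in EIIP])
    signal = signal - signal.mean()               # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return spectrum / spectrum.sum()              # feature vector for the later phases

seq = "ATGCGTACGTTAGCCATGACGTTAA"                  # toy fragment, not a real specimen barcode
print(power_spectrum(seq)[:5])
```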

Keywords: DNA Barcode, Electron-Ion Interaction Pseudopotential, Multi Library Wavelet Neural Networks.

3720 Dialogue Meetings as an Arena for Collaboration and Reflection among Researchers and Practitioners

Authors: Kerstin Grunden, Ann Svensson, Berit Forsman, Christina Karlsson, Ayman Obeid

Abstract:

The research question of this article is whether the dialogue meetings method could be relevant for reflective learning among researchers and practitioners when welfare technology is to be implemented in municipalities. A testbed was planned to be implemented in a retirement home in a Swedish municipality, and the practitioners worked on a pre-study of that testbed. In the article, the dialogue between the researchers and the practitioners in the dialogue meetings is described and analyzed. The potential of dialogue meetings as an arena for learning and reflection among researchers and practitioners is discussed. The research methodology approach is participatory action research with mixed methods (dialogue meetings, focus groups, participant observations). The main findings from the dialogue meetings were that the researchers learned more about the use of traditional research methods, and the practitioners learned more about how they could improve their use of the methods to facilitate change processes in their organization. These findings have the potential to result in more relevant use of research methods in change processes in organizations, both for the researchers and the practitioners. It is concluded that dialogue meetings could be relevant for reflective learning among researchers and practitioners when welfare technology is to be implemented in a health care organization.

Keywords: Dialogue meetings, implementation, reflection, test bed, welfare technology, participatory action research.

3719 Optimized Calculation of Hourly Price Forward Curve (HPFC)

Authors: Ahmed Abdolkhalig

Abstract:

This paper examines several mathematical methods for modelling the hourly price forward curve (HPFC); the model is constructed by numerous regression methods, such as polynomial regression, radial basis function neural networks and a Fourier series. The goodness of fit of the models is examined by means of statistical and graphical tools. The criterion for choosing the model is minimization of the Root Mean Squared Error (RMSE); using the correlation analysis approach for the regression analysis, the optimal model, which is robust against model misspecification, is identified. Learning and supervision techniques are employed to determine the optimal parameters corresponding to each measure of overall loss. Using all the numerical methods mentioned previously, the explicit expressions for the optimal model are derived and the optimal designs are implemented.
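
As a hedged sketch of the model-selection idea (fit several regression forms and compare them by RMSE), the following Python snippet fits a polynomial and a truncated Fourier series to synthetic hourly prices; the price series, the polynomial degree and the chosen harmonics are illustrative assumptions, and the radial basis function network of the paper is not included.

```python
import numpy as np

# hypothetical hourly prices for one week (168 hours) with a daily cycle plus noise
rng = np.random.default_rng(0)
hours = np.arange(168)
prices = 40 + 8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, 168)

def rmse(y, y_hat):
    return np.sqrt(np.mean((y - y_hat) ** 2))

# candidate 1: polynomial regression
poly = np.poly1d(np.polyfit(hours, prices, deg=6))

# candidate 2: truncated Fourier series with daily and weekly harmonics
F = np.column_stack([np.ones_like(hours, dtype=float)]
                    + [f(2 * np.pi * hours / p) for p in (24, 168) for f in (np.sin, np.cos)])
coef, *_ = np.linalg.lstsq(F, prices, rcond=None)

print("polynomial RMSE:", rmse(prices, poly(hours)))   # lower RMSE wins under the stated criterion
print("Fourier RMSE:   ", rmse(prices, F @ coef))
```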

Keywords: Forward curve, Fourier series, regression, radial basis function neural networks.

3718 Groundwater Quality Improvement by Using Aeration and Filtration Methods

Authors: Nik N. Nik Daud, Nur H. Izehar, B. Yusuf, Thamer A. Mohamed, A. Ahsan

Abstract:

An experiment was conducted using two aeration methods (water-into-air and air-into-water) followed by filtration processes using manganese greensand material. The properties of the groundwater, such as pH, dissolved oxygen, turbidity and heavy metal concentration (iron and manganese), were assessed. The objectives of this study are i) to determine the more effective aeration method and ii) to assess the effectiveness of manganese greensand as a filter medium in removing iron and manganese from groundwater. Results showed that the final pH for all samples after treatment ranged from 7.40 to 8.40. Both aeration methods increased the dissolved oxygen content. The final turbidity of the groundwater samples was between 3 NTU and 29 NTU. Only three out of eight samples achieved an iron concentration of 0.3 mg/L or less, while all samples reached a manganese concentration of 0.1 mg/L or less. The air-into-water aeration method gives a higher percentage of iron and manganese removal than the water-into-air method.

Keywords: Aeration, filtration, groundwater, water quality.

3717 Validation and Selection between Machine Learning Technique and Traditional Methods to Reduce Bullwhip Effects: a Data Mining Approach

Authors: Hamid R. S. Mojaveri, Seyed S. Mousavi, Mojtaba Heydar, Ahmad Aminian

Abstract:

The aim of this paper is to present a three-step methodology to forecast supply chain demand. In the first step, various data mining techniques are applied in order to prepare the data for entering the forecasting models. In the second step, the modeling step, an artificial neural network and a support vector machine are presented, after defining the Mean Absolute Percentage Error (MAPE) index for measuring error. The structure of the artificial neural network is selected based on previous researchers' results, and in this article the accuracy of the network is increased by using sensitivity analysis. The best forecast of the classical forecasting methods (Moving Average, Exponential Smoothing, and Exponential Smoothing with Trend) is obtained from the prepared data, and this forecast is compared with the results of the support vector machine and the proposed artificial neural network. The results show that the artificial neural network can forecast more precisely than the other methods. Finally, the stability of the forecasting methods is analyzed using raw data, and the effectiveness of the clustering analysis is also measured.
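
A hedged sketch of the comparison step: the snippet below computes MAPE for two classical baselines (moving average and simple exponential smoothing) on a synthetic demand series; the data, horizon and smoothing constant are illustrative assumptions, and the paper's ANN and SVM forecasts would be evaluated on the same index.

```python
import numpy as np

# hypothetical demand series; the paper works with prepared (data-mined) demand data
demand = np.array([120, 135, 128, 150, 160, 155, 170, 165, 180, 175, 190, 185], dtype=float)
train, test = demand[:-4], demand[-4:]

def mape(actual, forecast):
    return np.mean(np.abs((actual - forecast) / actual)) * 100

# classical baseline 1: moving average of the last three observations, held flat
ma_forecast = np.full(4, train[-3:].mean())

# classical baseline 2: simple exponential smoothing with an arbitrary alpha
alpha, level = 0.3, train[0]
for x in train[1:]:
    level = alpha * x + (1 - alpha) * level
ses_forecast = np.full(4, level)

print("MA  MAPE:", mape(test, ma_forecast))
print("SES MAPE:", mape(test, ses_forecast))
```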

Keywords: Artificial Neural Networks (ANN), bullwhip effect, demand forecasting, Support Vector Machine (SVM).

3716 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

It can frequently be observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood (order 1 & order 2) (MQL1, MQL2) and penalized quasi-likelihood (order 1 & order 2) (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is also equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted to a real-life dataset using the four methods, and the performance of the test was compared for each model.

Keywords: Goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, type-I error, penalized quasi-likelihood, power, quasi-likelihood.

3715 Developing a Viral Artifact to Improve Employees’ Security Behavior

Authors: Stefan Bauer, Josef Frysak

Abstract:

According to the scientific information management literature, improper use of information technology (e.g. personal computers) by employees is one of the main causes of operational and information security loss events. Therefore, organizations implement information security awareness programs to increase employees' awareness and thereby prevent loss events. However, in many cases these programs rely on conventional delivery methods such as posters, leaflets, or internal messages to make employees aware of information security policies. We assume that a viral information security awareness video might be a more effective medium than the conventional methods commonly used by organizations. The purpose of this research is to develop a viral video artifact to improve employee security behavior concerning information technology.

Keywords: Information Security Awareness, Delivery Methods, Viral Videos, Employee Security Behavior.

3714 Compressive Strength Evaluation of Underwater Concrete Structures Integrating the Combination of Rebound Hardness and Ultrasonic Pulse Velocity Methods with Artificial Neural Networks

Authors: Seunghee Park, Junkyeong Kim, Eun-Seok Shin, Sang-Hun Han

Abstract:

In this study, two kinds of nondestructive evaluation (NDE) techniques (rebound hardness and ultrasonic pulse velocity methods) are investigated for the effective maintenance of underwater concrete structures. A new methodology to estimate underwater concrete strengths more effectively, named “artificial neural network (ANN)-based concrete strength estimation with the combination of rebound hardness and ultrasonic pulse velocity methods”, is proposed and verified through a series of experimental works.
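
A minimal, hypothetical sketch of the estimation idea (a small neural network mapping rebound number and ultrasonic pulse velocity to compressive strength), written in Python with scikit-learn; the synthetic training relation and the network size are assumptions, not the calibrated model of the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# synthetic training data: rebound number and ultrasonic pulse velocity (m/s) -> strength (MPa)
rng = np.random.default_rng(1)
rebound = rng.uniform(20, 50, 200)
upv = rng.uniform(3500, 4800, 200)
strength = 0.9 * rebound + 0.015 * (upv - 3500) + rng.normal(0, 2, 200)   # assumed relation

X = np.column_stack([rebound, upv])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=1))
model.fit(X, strength)

print(model.predict([[35.0, 4200.0]]))   # estimated strength for one new pair of NDE readings
```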

Keywords: Underwater Concrete, Rebound Hardness, Schmidt hammer, Ultrasonic Pulse Velocity, Ultrasonic Sensor, Artificial Neural Networks, ANN.

3713 Simple Procedure for Probability Calculation of Tensile Crack Occurring in Rigid Pavement – Case Study

Authors: Aleš Florian, Lenka Ševelová, Jaroslav Žák

Abstract:

The formation of tensile cracks in the concrete slabs of rigid pavement can be (among other things) the initiation point of other, more serious failures, which can ultimately lead to complete degradation of the concrete slab and thus of the whole pavement. Two measures can be used for reliability assessment of this phenomenon - the probability of failure and/or the reliability index. Different methods can be used for their calculation; the simple ones are moment methods and simulation techniques. Two methods - the FOSM method and the Simple Random Sampling method - are verified and compared. The influence of information about the probability distribution and the statistical parameters of the input variables, as well as of the limit state function, on the calculated reliability index and failure probability is studied at three points on the lower surface of concrete slabs of the older type of rigid pavement formerly used in the Czech Republic.
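
As a hedged sketch of the two verified techniques, the snippet below estimates the failure probability by Simple Random Sampling for a linear limit state g = R - S and compares the corresponding reliability index with the FOSM value; the distributions and their parameters are illustrative assumptions, not the pavement data of the study.

```python
import numpy as np
from scipy.stats import norm

# hypothetical linear limit state g = R - S: tensile strength minus tensile stress,
# both modelled as normal variables (parameters are illustrative only)
rng = np.random.default_rng(42)
n = 200_000
R = rng.normal(4.5, 0.5, n)    # flexural tensile strength [MPa]
S = rng.normal(3.0, 0.6, n)    # tensile stress at the slab surface [MPa]

g = R - S                      # failure when g < 0
pf = np.mean(g < 0)            # Simple Random Sampling estimate of the failure probability
beta_srs = -norm.ppf(pf)       # corresponding reliability index

beta_fosm = (4.5 - 3.0) / np.sqrt(0.5 ** 2 + 0.6 ** 2)   # FOSM index for the same linear g
print(f"P_f = {pf:.4f}, beta_SRS = {beta_srs:.2f}, beta_FOSM = {beta_fosm:.2f}")
```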

Keywords: Failure, pavement, probability, reliability index, simulation, tensile crack.

3712 Comparison of Detrending Methods in Spectral Analysis of Heart Rate Variability

Authors: Liping Li, Changchun Liu, Ke Li, Chengyu Liu

Abstract:

A non-stationary trend in an R-R interval series is considered a main factor that can strongly influence the evaluation of spectral analysis, so it is suggested that trends be removed in order to obtain reliable results. In this study, three detrending methods, the smoothness priors approach, the wavelet method and the empirical mode decomposition, were compared on artificial R-R interval series with four types of simulated trends. The Lomb-Scargle periodogram was used for spectral analysis of the R-R interval series. Results indicated that the wavelet method showed a better overall performance than the other two methods, and was more time-saving, too. Therefore it was selected for spectral analysis of the real R-R interval series of thirty-seven healthy subjects. Significant decreases (19.94±5.87% in the low frequency band and 18.97±5.78% in the ratio (p<0.001)) were found. Thus the wavelet method is recommended as an optimal choice for use.
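
A hedged sketch of the spectral-analysis step: the snippet below removes a crude linear trend from a synthetic R-R series (standing in for the wavelet detrending selected by the paper) and computes a Lomb-Scargle periodogram over the usual HRV band; the series, its parameters and the frequency grid are illustrative assumptions.

```python
import numpy as np
from scipy.signal import lombscargle

# synthetic R-R interval series with a 0.1 Hz oscillation and a slow linear trend
rng = np.random.default_rng(7)
t = np.cumsum(rng.normal(0.8, 0.05, 300))                   # beat times [s], unevenly spaced
rr = 0.8 + 0.04 * np.sin(2 * np.pi * 0.1 * t) + 0.0004 * t + rng.normal(0, 0.01, 300)

# crude linear detrending stands in here for the wavelet detrending of the paper
detrended = rr - np.polyval(np.polyfit(t, rr, 1), t)

freqs = np.linspace(0.04, 0.4, 200)                          # LF and HF band [Hz]
power = lombscargle(t, detrended, 2 * np.pi * freqs)         # angular frequencies expected
print(freqs[np.argmax(power)])                               # dominant frequency, expected near 0.1 Hz
```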

Keywords: empirical mode decomposition, heart rate variability, signal detrending, smoothness priors, wavelet
