Search results for: Agile methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4046

3596 Classic and Heuristic Approaches in Robot Motion Planning: A Chronological Review

Authors: Ellips Masehian, Davoud Sedighizadeh

Abstract:

This paper reviews the major contributions to the Motion Planning (MP) field throughout a 35-year period, from classic approaches to heuristic algorithms. Due to the NP-hardness of the MP problem, heuristic methods have outperformed the classic approaches and have gained wide popularity. After surveying around 1400 papers in the field, the number of existing works for each method is identified and classified. In particular, the history and applications of numerous heuristic methods in MP are investigated. The paper concludes with comparative tables and graphs demonstrating the frequency of each MP method's application, and can thus serve as a guideline for MP researchers.

Keywords: Robot motion planning, Heuristic algorithms.

3595 Increase Energy Savings with Lighting Automation Using Light Pipes and Power LEDs

Authors: İ. Kıyak, G. Gökmen

Abstract:

The use of natural lighting has come into prominence in buildings, especially in the last ten years, within the scope of energy efficiency. Natural lighting methods aim to take maximum advantage of daylight and to decrease the use of artificial lighting. Increasing the amount of daylight in buildings by suitable methods gives optimum results in terms of comfort and energy saving when daylight-artificial light integration is ensured with a suitable control system. Using natural light in places that require lighting ensures energy savings to a great extent. This study aims to save the energy used for lighting. Under this scope, the lighting of a scanning laboratory of a hospital was realized using a lighting automation system combining natural and artificial lighting. For natural lighting, light pipes were used, and for artificial lighting, dimmable power LED modules were used. The need for lighting was tracked with motion sensors, and the automation combining natural and artificial light was ensured with fuzzy logic control. At the scanning laboratory where this application was realized, energy savings in lighting were obtained.

Keywords: Daylight transfer, fuzzy logic controller, light pipe, Power LED.

3594 Solid Waste Management through Mushroom Cultivation – An Eco-Friendly Approach

Authors: Mary Josephine

Abstract:

The waste of one process can be the input of other sectors, reducing environmental pollution. Today, more and more solid waste is generated, but only a very small amount of it is recycled, so the threat that environmental pressure poses to public health is serious. The methods considered for the treatment of solid waste are biogas tanks or processing into animal feed and fertilizer; however, they have not performed well. An alternative approach is growing mushrooms on waste residues, which is regarded as an environmentally friendly solution with potential economic benefit. Substrate producers do their best to produce quality substrate at low cost; this can be achieved by employing biologically degradable wastes as the resource material component of the substrate. Mushroom growing is a significant tool for the restoration, replenishment and remediation of Earth's overburdened ecosphere. One of the rational methods of waste utilization involves locally available wastes. The present study aims to find out the yield of mushrooms grown on locally available waste obtained for free, and to conserve our environment by recycling wastes.

Keywords: Biodegradable, environment, mushroom, remediation.

3593 Comparison of Evolutionary Algorithms and their Hybrids Applied to MarioAI

Authors: Hidehiko Okada, Yuki Fujii

Abstract:

Researchers have been applying artificial/computational intelligence (AI/CI) methods to computer games. In this research field, further studies are required to compare AI/CI methods with respect to each game application. In this paper, we report our experimental results on the comparison of an evolution strategy, a genetic algorithm and their hybrids, applied to evolving controller agents for MarioAI. The GA revealed its advantage in our experiment, whereas the expected ability of the ES to exploit (fine-tune) solutions was not clearly observed. The blend crossover operator and the mutation operator of the GA might contribute well to exploring the vast search space.

Keywords: Evolutionary algorithm, autonomous game controller agent, neuroevolution, MarioAI.

3592 Auditing of Building Information Modeling Application in Decoration Engineering Projects in China

Authors: Lan Luo

Abstract:

In China's construction industry, it is normal practice to subcontract the decoration engineering part separately from construction engineering, and Building Information Modeling (BIM) is also done separately. The application of BIM in decoration engineering should be integrated with the other disciplines, but current Chinese practice makes this very difficult and complicated. Currently, there are three barriers to the auditing of BIM application in decoration engineering in China: heavy workload, scarcity of qualified professionals, and a lack of literature concerning audit contents, standards, and methods. Therefore, it is important to investigate what (contents) should be evaluated, in which phase, and by whom (professional qualifications) in BIM application in decoration construction, so that the application of BIM can be better promoted. Based on these considerations, four principles of BIM auditing are proposed: comprehensiveness of information, accuracy of data, aesthetic attractiveness of appearance, and scheme optimization. In the model audit, three methods should be used: collision, observation, and contrast. In addition, BIM auditing at six stages is discussed, and a checklist of work items and results to be submitted is proposed. This checklist can be used for reference by decoration project participants.

Keywords: Audit, evaluation, dimensions, methods, standards, building information modeling application, decoration engineering projects.

3591 Simple and Advanced Models for Calculating Single-Phase Diode Rectifier Line-Side Harmonics

Authors: Hussein A. Kazem, Abdulhakeem Abdullah Albaloshi, Ali Said Ali Al-Jabri, Khamis Humaid AlSaidi

Abstract:

This paper proposes different methods for estimating the harmonic currents of the single-phase diode bridge rectifier. Both simple and advanced methods are compared, and the models are put into a context of practical use for calculating the harmonic distortion in a typical application. Finally, the different models are compared to measurements of a real application, and convincing results are achieved.

Keywords: Single-phase rectifier, line-side harmonics.

3590 Age-Related Changes of the Sella Turcica Morphometry in Adults Older Than 20-25 Years

Authors: Yu. I. Pigolkin, M. A. Garcia Corro

Abstract:

Age determination of unknown dead bodies in forensic personal identification is a complicated process which involves the application of numerous methods and techniques. Skeletal remains are less exposed to influences of environmental factors. In order to enhance the accuracy of forensic age estimation, additional properties of bones correlating with age need to be revealed. Material and Methods: Dimensional examination of the sella turcica was carried out on cadavers with the cranium opened by a circular vibrating saw. The sample consisted of a total of 90 Russian subjects, ranging in age from two months to 87 years. Results: The tendency of dimensional variations throughout life was detected. No gender differences in the morphometry of the sella turcica were observed. The combined use of sella turcica depth and length values made it possible to assign an examined sample to a certain age period. Conclusions: Alongside existing methods of age determination, the morphometry of the sella turcica can serve as an additional characteristic, amplifying the received values and accordingly increasing the accuracy of forensic biological age diagnosis.

Keywords: Age-related changes in bone structures, forensic personal identification, sella turcica morphometry, body identification.

3589 Dynamically Monitoring Production Methods for Identifying Structural Changes Relevant to Logistics

Authors: Marco Kennemann, Steffen C. Eickemeyer, Peter Nyhuis

Abstract:

Due to the growing dynamics and complexity of the market environment, production enterprises in particular are faced with new logistic challenges. Moreover, it is in this dynamic environment that the Logistic Operating Curve Theory reaches its limits as a method for describing the correlations between logistic objectives. In order to convert this theory into a method for dynamically monitoring production, this paper introduces methods for reliably and quickly identifying structural changes relevant to logistics.

Keywords: Dynamics, Logistic Operating Curves, Production Logistics, Production Planning and Control

3588 Net Fee and Commission Income Determinants of European Cooperative Banks

Authors: Karolína Vozková, Matěj Kuc

Abstract:

Net fee and commission income is one of the key elements of a bank's core income. In the current low-interest-rate environment, this type of income is gaining importance relative to net interest income. This paper analyses the effects of bank- and country-specific determinants of net fee and commission income on a set of cooperative banks from European countries over the 2007-2014 period. To do so, dynamic panel data methods (system Generalized Method of Moments) were employed, and alternative panel data methods were subsequently run as robustness checks. A strong positive impact of bank concentration on the share of net fee and commission income was found, which shows that cooperative banks tend to display a higher share of fee income in less competitive markets. This is probably connected with the fact that they stick with their traditional deposit-taking and loan-providing model, and fees on these services are driven down by competitors. Moreover, compared to commercial banks, cooperatives do not expand heavily into non-traditional fee-bearing services under competition, and their overall fee income share therefore decreases with the increased competitiveness of the sector.

Keywords: Cooperative banking, dynamic panel data models, net fee and commission income, system GMM.

3587 Data Mining in the Medicine Domain Using Decision Trees and Support Vector Machines

Authors: Djamila Benhaddouche, Abdelkader Benyettou

Abstract:

In this paper, we used data mining to extract biomedical knowledge. In general, complex biomedical data collected in population studies are treated by statistical methods; although these are robust, they are not sufficient in themselves to harness the potential wealth of the data. To that end, two supervised learning algorithms were used: Decision Trees and the Support Vector Machine (SVM). These supervised classification methods are used to make the diagnosis of thyroid disease. In this context, we propose to promote the study and use of symbolic data mining techniques.
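
As a concrete illustration of this two-classifier set-up, a minimal sketch follows, assuming scikit-learn; the synthetic dataset below is a stand-in, since the paper's actual thyroid measurements are not reproduced here.

```python
# Hedged sketch: supervised diagnosis with a decision tree and an SVM.
# The dataset is a synthetic placeholder for the biomedical features.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Stand-in for the thyroid measurements described in the paper.
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for name, model in [("Decision tree", DecisionTreeClassifier(random_state=0)),
                    ("SVM", SVC(kernel="rbf"))]:
    model.fit(X_train, y_train)
    print(name, "accuracy:", accuracy_score(y_test, model.predict(X_test)))
```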

Keywords: Classifier, decision tree algorithms, knowledge extraction, Support Vector Machine.

3586 Modern Trends in Foreign Direct Investments in Georgia

Authors: Rusudan Kinkladze, Guguli Kurashvili, Ketevan Chitaladze

Abstract:

Foreign direct investment is a driving force in the development of interdependent national economies, and the study and analysis of investments is an urgent problem, particularly for transitional economies such as Georgia. Consequently, the goal of the research is the study and analysis of foreign direct investments in Georgia and the identification and forecasting of modern trends; it covers the period 2006-2015. The study uses the methods of statistical observation, grouping and analysis, and the methods of analytical indicators of time series and trend identification; predicted values are calculated, and various literary and Internet sources relevant to the research are drawn on. The findings showed that modern investment policy in Georgia is favorable for domestic as well as foreign investors, although Georgia is still a net importer of investments. In 2015, the top 10 investing countries were led by Azerbaijan, the United Kingdom and the Netherlands, and the largest share of FDI was allocated to the transport and communication sector; the financial sector was second, followed by the health and social work sector. The same trend is expected to continue in the future.

Keywords: Foreign Direct Investments, methods, statistics, analysis.

3585 Prediction Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy

Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie

Abstract:

In recent years, there has been an explosion in the use of technologies that help discover diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell; they have great potential to provide accurate medical diagnoses and to help find the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that can predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time of eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces better results and takes the lowest time to build the model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best, due to its high accuracy and lowest model-building time.
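
A minimal sketch of the kind of accuracy-versus-build-time comparison the study performs; Random Tree and NNge, as named in the abstract, have no exact scikit-learn equivalents, so common stand-ins are substituted here on synthetic data.

```python
# Hedged sketch: compare classifiers on accuracy and model-build time.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, ExtraTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=400, n_features=50, random_state=1)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=1)

for name, clf in [("DecisionTree", DecisionTreeClassifier(random_state=1)),
                  ("ExtraTree (RandomTree-like)", ExtraTreeClassifier(random_state=1)),
                  ("kNN (NNge-like stand-in)", KNeighborsClassifier())]:
    t0 = time.perf_counter()
    clf.fit(Xtr, ytr)                          # measure only model building
    build_ms = (time.perf_counter() - t0) * 1e3
    print(f"{name}: accuracy={clf.score(Xte, yte):.3f}, build={build_ms:.1f} ms")
```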

Keywords: Data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data.

3584 Dimensionality Reduction of PSSM Matrix and its Influence on Secondary Structure and Relative Solvent Accessibility Predictions

Authors: Rafał Adamczak

Abstract:

State-of-the-art methods for secondary structure (Porter, Psi-PRED, SAM-T99sec, Sable) and solvent accessibility (Sable, ACCpro) predictions use evolutionary profiles represented by the position specific scoring matrix (PSSM). It has been demonstrated that evolutionary profiles are the most important features in the feature space for these predictions. Unfortunately, applying the PSSM matrix leads to high-dimensional feature spaces that may create problems with parameter optimization and generalization. Several recently published studies suggested that applying feature extraction to the PSSM matrix may result in improvements in secondary structure predictions. However, none of the top-performing methods considered here utilizes dimensionality reduction to improve generalization. In the present study, we used simple and fast feature selection methods (t-statistics, information gain) that allow us to decrease the dimensionality of the PSSM matrix by 75% and improve generalization in the case of secondary structure prediction, compared to the Sable server.
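
A sketch of the filter-style selection step, assuming scikit-learn's f_classif as a t-statistic-like score and mutual_info_classif as an information-gain analogue; the PSSM-derived features are replaced by synthetic data, and the 75% reduction target is taken from the abstract.

```python
# Hedged sketch: shrink a high-dimensional feature matrix by 75% with two
# fast filter scores, keeping the top-scoring 25% of columns.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif

X, y = make_classification(n_samples=1000, n_features=400, n_informative=40,
                           random_state=0)
k = X.shape[1] // 4  # keep 25% of features, i.e. a 75% reduction

for name, score in [("t-like F-statistic", f_classif),
                    ("information gain (MI)", mutual_info_classif)]:
    X_red = SelectKBest(score, k=k).fit_transform(X, y)
    print(name, "->", X_red.shape)
```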

Keywords: Secondary structure prediction, feature selection, position specific scoring matrix.

3583 Removing Ocular Artifacts from EEG Signals using Adaptive Filtering and ARMAX Modeling

Authors: Parisa Shooshtari, Gelareh Mohamadi, Behnam Molaee Ardekani, Mohammad Bagher Shamsollahi

Abstract:

The EEG signal is one of the oldest measures of brain activity and has been used widely for clinical diagnoses and biomedical research. However, EEG signals are highly contaminated with various artifacts, both from the subject and from equipment interference. Among these artifacts, ocular noise is the most important one. Since many applications such as BCI require online, real-time processing of the EEG signal, it is ideal if the removal of artifacts is performed online as well. Recently, some methods for online ocular artifact removal have been proposed. One of these methods is ARMAX modeling of the EEG signal. This method assumes that the recorded EEG signal is a combination of EOG artifacts and the background EEG; the background EEG is then estimated via estimation of the ARMAX parameters. The other recently proposed method is based on adaptive filtering: it uses the EOG signal as the reference input and subtracts EOG artifacts from the recorded EEG signals. In this paper, we investigate the efficiency of each method in removing EOG artifacts and compare the two. Our conclusion from this comparison is that the adaptive filtering method gives better results than ARMAX modeling.
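
A minimal sketch of the adaptive-filtering approach, using the EOG channel as the reference input; the abstract does not name the specific adaptive algorithm, so an LMS update is assumed, and the signals, filter order and step size are illustrative.

```python
# Hedged sketch: LMS adaptive filter removes EOG from a contaminated EEG.
import numpy as np

def lms_artifact_removal(eeg, eog, order=5, mu=0.01):
    """Subtract an adaptively filtered EOG reference from the EEG."""
    w = np.zeros(order)
    clean = np.zeros_like(eeg)
    for n in range(order, len(eeg)):
        x = eog[n - order:n][::-1]   # reference tap vector
        y = w @ x                    # estimated EOG contribution
        e = eeg[n] - y               # error signal = cleaned EEG sample
        w += 2 * mu * e * x          # LMS weight update
        clean[n] = e
    return clean

rng = np.random.default_rng(0)
t = np.arange(2000) / 250.0                  # 8 s at 250 Hz (synthetic)
eog = np.sin(2 * np.pi * 0.3 * t)            # slow ocular activity (synthetic)
background = rng.normal(scale=0.2, size=t.size)
eeg = background + 0.8 * eog                 # contaminated recording
residual = lms_artifact_removal(eeg, eog) - background
print(np.std(eeg - background), "->", np.std(residual))  # artifact power drops
```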

Keywords: Ocular Artifacts, EEG, Adaptive Filtering, ARMAX

3582 Mathematical Reconstruction of an Object Image Using X-Ray Interferometric Fourier Holography Method

Authors: M. K. Balyan

Abstract:

The main principles of the X-ray interferometric Fourier holography method are discussed. The object image is reconstructed by the mathematical method of Fourier transformation. Three methods are presented: the approximation method, the iteration method and the step-by-step method. As an example, the reconstruction of the complex amplitude transmission coefficient of a beryllium wire is considered. The results reconstructed by the three presented methods are compared; the best results are obtained by the step-by-step method.

Keywords: Dynamical diffraction, hologram, object image, X-ray holography.

3581 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led most cost estimates to be unclear and riddled with inaccuracies that at times result in over- or underestimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological, machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges of this exploratory study are also reported, and recommendations for future studies are proposed.

Keywords: BIM, Construction projects, Cost estimation, NRM, Ontology.

3580 Impacts of E-learning in Nursing Education: In the Light of Recent Studies

Authors: A.Ö. İlkay, C.O. Zeynep

Abstract:

Information and Communication Technologies (ICT) have changed the way we live and learn, opening doors to new, innovative methods of delivering education. E-learning is a part of ICT and has been endorsed as a tool for developing "21st century skills" in higher education. The aim of this review is to establish the impacts of e-learning in undergraduate nursing education. A systematic literature review was conducted to assess the impacts of e-learning in nursing education, using Akdeniz University electronic databases. According to the results, we can conclude that nursing faculties cannot treat e-learning methods as a single tool; e-learning should be used with a good understanding of learners' needs.

Keywords: E-learning, nursing education, systematic literature review.

3579 Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems

Authors: Dr. L. Arockiam, A. Aloysius

Abstract:

In general, class complexity is measured based on factors such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA) and so on. Several new techniques, methods and metrics based on different factors have been developed by researchers for calculating the complexity of classes in Object Oriented (OO) software. Earlier, Arockiam et al. proposed a complexity measure, Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of the class and of the classes derived from it. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute holds the same weight, whereas in general the cognitive load in understanding different types of attributes is not the same. We therefore propose a new metric, Attribute Weighted Class Complexity (AWCC), in which cognitive weights are assigned to attributes according to the effort needed to understand their data types. The proposed metric has proved to be a better measure of the complexity of classes with attributes through case studies and experiments.
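
A toy illustration of the AWCC idea: class complexity as the sum of data-type-dependent attribute weights and method cognitive weights. The weight table and example values below are invented for illustration, since the abstract does not tabulate the actual weights.

```python
# Hedged sketch of the AWCC computation; all weights here are hypothetical.
ATTRIBUTE_WEIGHTS = {"int": 1, "float": 1, "str": 1, "list": 2, "dict": 3}

def awcc(attribute_types, method_weights, derived_class_awcc=0):
    """AWCC = sum of type-dependent attribute weights + sum of method
    cognitive weights (+ complexity of derived classes, as in EWCC)."""
    return (sum(ATTRIBUTE_WEIGHTS[t] for t in attribute_types)
            + sum(method_weights) + derived_class_awcc)

# A class with three attributes and two methods of cognitive weight 2 and 4:
print(awcc(["int", "list", "dict"], [2, 4]))  # -> 12
```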

Keywords: Software Complexity, Attribute Weighted Class Complexity, Weighted Class Complexity, Data Type

3578 Advanced Neural Network Learning Applied to Pulping Modeling

Authors: Z. Zainuddin, W. D. Wan Rosli, R. Lanouette, S. Sathasivam

Abstract:

This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were used for the pulping application: three-layer feed-forward neural networks trained with Preconditioned Conjugate Gradient (PCG) methods were used in this investigation. Preconditioning is a method to improve convergence by lowering the condition number and increasing the eigenvalue clustering; the idea is to solve the modified problem M^(-1)Ax = M^(-1)b, where M is a positive-definite preconditioner closely related to A. We mainly focused on PCG-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves update (PCGF), with Polak-Ribiere update (PCGP) and with Powell-Beale restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
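
A minimal sketch of the underlying preconditioned conjugate gradient iteration on a linear system, assuming a Jacobi (diagonal) preconditioner; the PCGF/PCGP/PCGB training variants adapt the same update formulas to nonlinear network weights, which is not reproduced here.

```python
# Hedged sketch: PCG solves M^(-1)Ax = M^(-1)b without forming M^(-1)A.
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv_diag * r              # apply M^(-1) (Jacobi preconditioner)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p   # conjugate direction update
        rz = rz_new
    return x

rng = np.random.default_rng(0)
Q = rng.normal(size=(50, 50))
A = Q @ Q.T + 50 * np.eye(50)       # symmetric positive definite test matrix
b = rng.normal(size=50)
x = pcg(A, b, 1.0 / np.diag(A))
print(np.linalg.norm(A @ x - b))    # residual norm (should be tiny)
```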

Keywords: Convergence, pulping modeling, neural networks, preconditioned conjugate gradient.

3577 Evolution of Performance Measurement Methods in Conditions of Uncertainty: The Implementation of Fuzzy Sets in Performance Measurement

Authors: E. A. Tkachenko, E. M. Rogova, V. V. Klimov

Abstract:

One of the basic issues of development management is performance measurement as a prerequisite for identifying the achievement of development objectives. The aim of our research is to develop an improved model for assessing a company's development results. The model should take into account the cyclical nature of development and the high degree of uncertainty in dealing with numerous management tasks. Our hypotheses may be formulated as follows. Hypothesis 1: the cycle of a company's development may be studied from the standpoint of a project cycle; to do that, the methods and tools of project analysis are to be used. Hypothesis 2: the problem of uncertainty when justifying managerial decisions within the framework of a company's development cycle can be solved through the mathematical apparatus of fuzzy logic. A reasoned justification of the validity of these hypotheses is given in the article. The fuzzy logic toolkit is applied to the case of a technology shift within an enterprise. It is proven that some restrictions in performance measurement incurred by conventional methods can be eliminated by implementing the fuzzy logic apparatus in performance measurement models.

Keywords: Fuzzy logic, fuzzy sets, performance measurement, project analysis.

3576 Integrated Grey Rational Analysis-Standard Deviation Method for Handover in Heterogeneous Networks

Authors: Mohanad Alhabo, Naveed Nawaz, Mahmoud Al-Faris

Abstract:

The dense deployment of small cells is a promising solution to enhance the coverage and capacity of heterogeneous networks (HetNets). However, unplanned deployment could bring new challenges to the network, ranging from interference to unnecessary handovers and handover failures, causing a degradation in the quality of service (QoS) delivered to the end user. In this paper, we propose an integrated Grey Rational Analysis Standard Deviation based handover method (GRA-SD) for HetNets. The proposed method integrates the Standard Deviation (SD) technique to acquire the weights of the handover metrics and the GRA method to select the best handover base station. The performance of the GRA-SD method is evaluated and compared with traditional Multiple Attribute Decision Making (MADM) methods, including the Simple Additive Weighting (SAW) and VIKOR methods. Results reveal that the proposed method outperforms the other methods in terms of minimizing the number of frequent unnecessary handovers and handover failures, in addition to improving energy efficiency.
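
A sketch of the GRA-SD ranking pipeline as described in the abstract: normalize the handover metrics, weight them by their standard deviations, and score candidate base stations by grey relational grade. The metric matrix, the benefit/cost split and the 0.5 distinguishing coefficient are illustrative assumptions, not the paper's parameters.

```python
# Hedged sketch: SD-weighted grey relational grading of handover candidates.
import numpy as np

def gra_sd(metrics, benefit, zeta=0.5):
    """metrics: candidates x criteria; benefit[j] True if larger is better."""
    m = metrics.astype(float).copy()
    for j in range(m.shape[1]):              # normalize each criterion to [0, 1]
        col = m[:, j]
        span = col.max() - col.min()
        m[:, j] = (col - col.min()) / (span if span else 1.0)
        if not benefit[j]:                   # flip cost criteria
            m[:, j] = 1.0 - m[:, j]
    w = m.std(axis=0)
    w /= w.sum()                             # SD-based metric weights
    delta = np.abs(m - m.max(axis=0))        # distance to the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff @ w                         # grey relational grade per candidate

# Columns (invented): RSRP in dBm (benefit), cell load (cost), dwell time (benefit).
candidates = np.array([[-80.0, 0.6, 4.0],
                       [-75.0, 0.9, 1.5],
                       [-90.0, 0.2, 6.0]])
grades = gra_sd(candidates, [True, False, True])
print(grades.round(3), "-> handover to base station", grades.argmax())
```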

Keywords: Energy efficiency, handover, HetNets, MADM, small cells.

3575 Numerical Modelling of Crack Initiation around a Wellbore Due to Explosion

Authors: Meysam Lak, Mohammad Fatehi Marji, Alireza Yarahamdi Bafghi, Abolfazl Abdollahipour

Abstract:

A wellbore is a hole that is drilled to aid in the exploration and recovery of natural resources, including oil and gas. In order to increase the productivity index and the porosity of the wellbore and reservoir, well stimulation methods are occasionally used; hydraulic fracturing is one of these methods. Moreover, several explosions at the end of the well can stimulate the reservoir and create fractures around it. In this study, crack initiation in the rock around the wellbore due to explosion has been numerically modeled. One, two, three, and four pairs of explosive charges were set at the end of the wellbore on its wall. After each stage of the explosion, results are presented and discussed. Results show that this method can initiate, and probably propagate, several fractures around the wellbore.

Keywords: Crack initiation, explosion, finite difference modelling, well productivity.

3574 Comparative Study of QRS Complex Detection in ECG

Authors: Ibtihel Nouira, Asma Ben Abdallah, Ibtissem Kouaja, Mohamed Hèdi Bedoui

Abstract:

The processing of the electrocardiogram (ECG) signal consists essentially in the detection of the characteristic points of the signal, which are an important tool in the diagnosis of heart diseases; the most important of these is the detection of R waves. In this paper, we present various mathematical tools used for filtering the ECG, using digital filtering and Discrete Wavelet Transform (DWT) filtering. In addition, the paper includes two main R-peak detection methods that apply a windowing process: the first method is based on derivative calculations, and the second is a time-frequency method based on the Dyadic Wavelet Transform (DyWT).
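
A minimal sketch of the first (derivative-based) detector with a refractory windowing step; the sampling rate, thresholds and synthetic ECG below are illustrative choices, not the paper's exact processing chain.

```python
# Hedged sketch: derivative-based R-peak detection with windowing.
import numpy as np

def detect_r_peaks(ecg, fs=250, refractory=0.25):
    feature = np.diff(ecg) ** 2              # squared derivative: steep QRS slopes
    thresh = 8 * feature.mean()              # crude adaptive threshold
    peaks, last = [], -fs
    for n in np.flatnonzero(feature > thresh):
        if n - last > refractory * fs:       # windowing: one beat per window
            lo, hi = max(n - 10, 0), min(n + 10, len(ecg))
            peaks.append(lo + int(np.argmax(ecg[lo:hi])))  # refine to raw maximum
            last = n
    return np.array(peaks)

fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = np.random.default_rng(0).normal(0, 0.03, t.size)
ecg[::fs] += 1.0                             # one sharp synthetic "R wave" per second
print(detect_r_peaks(ecg, fs))               # ~10 detections, one per second
```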

Keywords: Derived calculation methods, Electrocardiogram, R peaks, Wavelet Transform.

3573 Application of the Reliability Method for Concrete Dams

Authors: Mustapha Kamel Mihoubi, Mohamed Essadik Kerkar

Abstract:

Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of works, including when calculating the stability of large structures against a major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods used in engineering; in our case, level 2 methods via the limit-state study. Hence, the probability of failure is estimated by analytical methods of the first-order reliability method (FORM) and second-order reliability method (SORM) type. By way of comparison, a level 3 method was also used, which generates a full analysis of the problem and involves integrating the probability density function of the random variables, extended to the security domain, using the Monte Carlo simulation method. Taking into account the change in stress under the load combinations acting on the dam (normal, exceptional and extreme), the results obtained provide acceptable failure probability values that largely corroborate the theory: the probability of failure tends to increase with increasing load intensities, causing a significant decrease in strength; shear forces then induce a sliding that threatens the reliability of the structure through intolerable failure probability values, especially when uplift increases under a hypothetical failure of the drainage system.
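
A sketch of the level 3 (Monte Carlo) computation: draw the random variables, evaluate a limit-state function g, and estimate the failure probability as the fraction of samples with g < 0. The sliding limit state and all distributions below are schematic stand-ins, not the paper's dam model.

```python
# Hedged sketch: Monte Carlo estimate of P(g(X) < 0) for a sliding limit state.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
friction = rng.normal(0.75, 0.08, n)     # tan(phi), random strength (invented)
weight = rng.normal(120e3, 8e3, n)       # weight of the dam block, kN (invented)
uplift = rng.normal(30e3, 6e3, n)        # uplift force, kN (invented)
shear = rng.normal(45e3, 9e3, n)         # horizontal driving force, kN (invented)

g = friction * (weight - uplift) - shear  # g < 0  <=>  sliding failure
pf = np.mean(g < 0)
print(f"P(failure) ~ {pf:.2e} +/- {np.sqrt(pf * (1 - pf) / n):.1e}")
```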

Keywords: Dam, failure, limit-state, Monte Carlo simulation, reliability, probability, simulation, sliding, Taylor.

3572 Determination of Measurement Uncertainty in Extracting of Forming Limit Diagrams

Authors: M. Mahboubkhah, H. Fayazfar

Abstract:

In this research, Forming Limit Diagrams (FLDs) for supertension sheet metals used in the automobile industry have been obtained. The strains exerted on the sheet metals have been measured with four different methods, and the errors of each method are also presented. The methods are compared with each other, and the most efficient and economical way of extracting the strains exerted on sheet metals is introduced. In this paper, the total error and uncertainty of FLD extraction procedures are derived. Determining the measurement uncertainty in extracting FLDs is of great importance in the design and analysis of sheet metal forming processes.

Keywords: Forming Limit Diagram, Major and Minor Strain, Measurement Uncertainty.

3571 Estimation of Real Power Transfer Allocation Using Intelligent Systems

Authors: H. Shareef, A. Mohamed, S. A. Khalid, Aziah Khamis

Abstract:

This paper presents the application of artificial intelligence (AI) techniques, namely the artificial neural network (ANN) and the adaptive neuro-fuzzy inference system (ANFIS), to estimate the real power transfer between generators and loads. Since these AI techniques adopt supervised learning, the modified nodal equation (MNE) method is first used to determine the real power contribution from each generator to the loads. Then the results of the MNE method and load flow information are utilized to estimate the power transfer using the AI techniques. The 25-bus equivalent system of south Malaysia is utilized as a test system to illustrate the effectiveness of both AI methods compared to the MNE method. The mean squared errors of the ANN and ANFIS power transfer allocation estimates are 1.19E-05 and 2.97E-05, respectively. Furthermore, the ANN and ANFIS methods compute the generator contributions to loads within 20.99 and 39.37 ms respectively, whereas the MNE method took 360 ms for the calculation of the same real power transfer allocation.
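
A sketch of the supervised set-up, assuming scikit-learn's MLPRegressor as the ANN: train on load-flow-style features against MNE-derived targets. The data here are synthetic stand-ins for the 25-bus system, and the ANFIS variant is not reproduced.

```python
# Hedged sketch: ANN regression of power-transfer targets from load-flow features.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                    # load-flow features (synthetic)
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=500)  # "MNE" targets (synthetic)

ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
ann.fit(X[:400], y[:400])                         # supervised training
print("test MSE:", np.mean((ann.predict(X[400:]) - y[400:]) ** 2))
```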

Keywords: Artificial intelligence, Power tracing, Artificial neural network, ANFIS, Power system deregulation.

3570 A Comparison of Some Thresholding Selection Methods for Wavelet Regression

Authors: Alsaidi M. Altaher, Mohd T. Ismail

Abstract:

In wavelet regression, choosing the threshold value is a crucial issue. A value that is too large cuts too many coefficients, resulting in oversmoothing; conversely, a threshold value that is too small allows many coefficients to be included in the reconstruction, giving a wiggly estimate that results in undersmoothing. The proper choice of threshold can thus be considered a careful balance of these principles. This paper gives a very brief introduction to some threshold selection methods: Universal, SURE, EBayes, two-fold cross-validation and level-dependent cross-validation. A simulation study on a variety of sample sizes, test functions and signal-to-noise ratios is conducted to compare their numerical performance using three different noise structures. For Gaussian noise, EBayes outperforms in all cases for all functions used, while two-fold cross-validation provides the best results in the case of long-tailed noise. For large values of the signal-to-noise ratio, level-dependent cross-validation works well in the correlated noise case. As expected, increasing both the sample size and the signal-to-noise ratio increases estimation efficiency.
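
A minimal wavelet-regression denoiser using the universal threshold, one of the rules compared above (EBayes, SURE and the cross-validation rules need more machinery than fits here); the PyWavelets package and the test function below are assumptions.

```python
# Hedged sketch: universal-threshold wavelet denoising (VisuShrink-style).
import numpy as np
import pywt

def universal_denoise(y, wavelet="db4", level=4):
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale from finest level
    thr = sigma * np.sqrt(2 * np.log(len(y)))        # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(y)]

x = np.linspace(0, 1, 1024)
signal = np.sin(4 * np.pi * x) + np.sign(x - 0.5)    # a synthetic "test function"
noisy = signal + np.random.default_rng(0).normal(0, 0.3, x.size)
print(np.mean((noisy - signal) ** 2), "->",
      np.mean((universal_denoise(noisy) - signal) ** 2))  # MSE drops
```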

Keywords: Wavelet regression, simulation, threshold.

3569 How Virtualization, Decentralization and Network Building Change the Manufacturing Landscape: An Industry 4.0 Perspective

Authors: Malte Brettel, Niklas Friederichsen, Michael Keller, Marius Rosenberg

Abstract:

The German manufacturing industry has to withstand increasing global competition on product quality and production costs. As labor costs are high, several industries have suffered severely under the relocation of production facilities to aspiring countries, which have managed to close the productivity and quality gap substantially. Established manufacturing companies have recognized that customers are not willing to pay large price premiums for incremental quality improvements. As a consequence, many companies in the German manufacturing industry adjust their production, focusing on customized products and fast time to market. Leveraging the advantages of novel production strategies such as Agile Manufacturing and Mass Customization, manufacturing companies transform into integrated networks in which companies unite their core competencies. Here, virtualization of the process and supply chain ensures smooth inter-company operations, providing real-time access to relevant product and production information for all participating entities. The boundaries of companies blur as autonomous systems exchange data gained by embedded systems throughout the entire value chain. By including Cyber-Physical Systems, advanced communication between machines becomes tantamount to their dialogue with humans. The increasing utilization of information and communication technology allows digital engineering of products and production processes alike. Modular simulation and modeling techniques allow decentralized units to flexibly alter products and thereby enable rapid product innovation. The present article describes the development of Industry 4.0 within the literature and reviews the associated research streams. We analyze eight scientific journals with regard to the following research fields: individualized production, end-to-end engineering in a virtual process chain, and production networks. We employ cluster analysis to assign sub-topics to the respective research fields. To assess the practical implications, we conducted face-to-face interviews with managers from industry as well as from the consulting business, using a structured interview guideline. The results reveal reasons for the adoption or refusal of Industry 4.0 practices from a managerial point of view. Our findings contribute to the upcoming research stream on Industry 4.0 and support decision-makers in assessing their need for transformation towards Industry 4.0 practices.

Keywords: Industry 4.0, Mass Customization, Production networks, Virtual Process-Chain.

3568 Estimating the Parameter of the Mean in the Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

Authors: Autcha Araveeporn

Abstract:

This paper compares estimation of the mean parameter of the normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data; the Bayes estimator is derived from the prior distribution; and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After estimation, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes of 10, 20, 30, and 50. From the results, it can be seen that the ML and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed from a prior distribution with mean 1 and variance 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.
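
The three estimators in miniature, using the abstract's settings (true mean 2, a N(1, 12) prior) and assuming a known variance; since the posterior is then conjugate and available in closed form, the "MCMC" step reduces to direct sampling from it.

```python
# Hedged sketch: ML, Bayes and sampling-based estimates of a normal mean.
import numpy as np

rng = np.random.default_rng(0)
mu_true, var, n = 2.0, 16.0, 20
x = rng.normal(mu_true, np.sqrt(var), n)

# 1) Maximum likelihood: the sample mean.
ml = x.mean()

# 2) Bayes with a conjugate N(1, 12) prior and known variance: closed form.
m0, v0 = 1.0, 12.0
v_post = 1.0 / (1.0 / v0 + n / var)
bayes = v_post * (m0 / v0 + x.sum() / var)       # posterior mean

# 3) MCMC-style: draw from the posterior and average (Gibbs sampling would
#    be needed if the variance were unknown as well).
mcmc = rng.normal(bayes, np.sqrt(v_post), 5000).mean()

print(f"ML={ml:.3f}  Bayes={bayes:.3f}  MCMC={mcmc:.3f}  (true mu={mu_true})")
```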

Keywords: Bayes method, Markov Chain Monte Carlo method, Maximum Likelihood method, normal distribution.

3567 Increased Capacity of Information Hiding in LSB's Method for Text and Image

Authors: H.B.Kekre, Archana Athawale, Pallavi N.Halarnkar

Abstract:

Steganography, derived from Greek, literally means "covered writing". It includes a vast array of secret communication methods that conceal a message's very existence, including invisible inks, microdots, character arrangement, digital signatures, covert channels, and spread spectrum communications. This paper proposes a new, improved version of the Least Significant Bit (LSB) method. The proposed approach is simple to implement compared to the Pixel Value Differencing (PVD) method, yet achieves a high embedding capacity and imperceptibility. The proposed method can also be applied to 24-bit color images, achieving an embedding capacity much higher than PVD.
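
For reference, a sketch of the baseline 1-bit-per-byte LSB substitution that the paper improves on; the improved scheme's exact bit-allocation rules are not given in the abstract, so only standard LSB embedding is shown.

```python
# Hedged sketch: classic LSB embedding and extraction on 8-bit pixel data.
import numpy as np

def lsb_embed(cover, message_bits):
    stego = cover.copy().ravel()
    # Clear each carrier byte's LSB, then OR in one message bit.
    stego[:len(message_bits)] = (stego[:len(message_bits)] & 0xFE) | message_bits
    return stego.reshape(cover.shape)

def lsb_extract(stego, n_bits):
    return stego.ravel()[:n_bits] & 1

cover = np.random.default_rng(0).integers(0, 256, (8, 8), dtype=np.uint8)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
stego = lsb_embed(cover, bits)
assert (lsb_extract(stego, 8) == bits).all()
print("max per-pixel change:", np.abs(stego.astype(int) - cover.astype(int)).max())  # <= 1
```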

Keywords: Information Hiding, LSB Matching, PVD Steganography.
