Search results for: Equivalent Static Analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9287


8147 An Elaborate Survey on Node Replication Attack in Static Wireless Sensor Networks

Authors: N. S. Usha, E. A. Mary Anita

Abstract:

Recent innovations in technology have led to the use of wireless sensor networks in various applications. Such networks consist of a number of tiny, low-cost, non-tamper-proof and resource-constrained sensor nodes. These nodes are often distributed and deployed in unattended environments, where they collaborate with each other to share data. Among their various applications, wireless sensor networks play a major role in battlefield monitoring for the military. Because these non-tamper-proof nodes are deployed in unattended locations, they are vulnerable to many security attacks, of which the node replication attack is among the most threatening to network users. In a node replication attack, an attacker captures one legitimate node, duplicates its identification and cryptographic material, makes one or more copies of the captured node and places them at key positions in the network to monitor or disrupt its operation. Preventing such node replication attacks is a challenging task. In this survey article, we provide a classification of detection schemes and explore the various schemes proposed in each category. We also compare the detection schemes against certain evaluation parameters and discuss their limitations. Finally, we provide some suggestions for future research against such attacks.

Keywords: Clone node, data security, detection schemes, node replication attack, wireless sensor networks.

8146 Biomethanation of Palm Oil Mill Effluent (POME) by Membrane Anaerobic System (MAS) using POME as a Substrate

Authors: N.H. Abdurahman, Y. M. Rosli, N. H. Azhari, S. F. Tam

Abstract:

The direct discharge of palm oil mill effluent (POME) wastewater causes serious environmental pollution due to its high chemical oxygen demand (COD) and biochemical oxygen demand (BOD). Traditional methods of POME treatment have both economic and environmental disadvantages. In this study, a membrane anaerobic system (MAS) was used as an alternative, cost-effective method for treating POME. Six steady states were attained as part of a kinetic study that considered concentration ranges of 8,220 to 15,400 mg/l for mixed liquor suspended solids (MLSS) and 6,329 to 13,244 mg/l for mixed liquor volatile suspended solids (MLVSS). Kinetic equations from Monod, Contois and Chen & Hashimoto were employed to describe the kinetics of POME treatment at organic loading rates ranging from 2 to 13 kg COD/m3/d. Throughout the experiment, the COD removal efficiency was 94.8 to 96.5%, with the hydraulic retention time (HRT) ranging from 400.6 to 5.7 days. The growth yield coefficient Y was found to be 0.62 g VSS/g COD, the specific microorganism decay rate was 0.21 d-1, and the methane gas yield production rate was between 0.25 l/g COD/d and 0.58 l/g COD/d. Steady-state influent COD concentrations increased from 18,302 mg/l in the first steady state to 43,500 mg/l in the sixth steady state. The minimum solids retention time obtained from the three kinetic models ranged from 5 to 12.3 days. The k values were in the range of 0.35-0.519 g COD/g VSS·d and the corresponding maximum specific growth rate values were between 0.26 and 0.379 d-1. The solids retention time (SRT) decreased from 800 days to 11.6 days. The complete treatment reduced the COD content to 2,279 mg/l, equivalent to a 94.8% reduction from the original.
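
For reference, Monod-type relations used in such kinetic studies take the following generic form; this is an illustrative sketch in generic notation, not the paper's own equations, but it ties together the reported yield coefficient Y, utilization rate k, decay rate and maximum specific growth rate:

```latex
% Monod-type kinetics in generic notation (illustrative sketch, not the paper's equations)
q = \frac{k\,S}{K_s + S} = -\frac{1}{X}\frac{dS}{dt}, \qquad
\frac{1}{X}\frac{dX}{dt} = Y\,q - k_d, \qquad
\mu_{\max} = Y\,k
```

Here S is the substrate (COD) concentration, X the biomass (MLVSS) concentration, q the specific substrate utilization rate, K_s the half-saturation constant, Y the growth yield coefficient and k_d the specific decay rate.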

Keywords: COD reduction, POME, kinetics, membrane, anaerobic, Monod, Contois equation.

8145 Assessment of Analytical Equations for the Derivation of Young’s Modulus of Bonded Rubber Materials

Authors: Z. N. Haji, S. O. Oyadiji, H. Samami, O. Farrell

Abstract:

The prediction of the vibration response of rubber products by analytical or numerical methods depends mainly on predefined intrinsic material properties such as Young's modulus, the damping factor and Poisson's ratio. Such intrinsic properties are determined experimentally by subjecting a bonded rubber sample to compression tests. Compression tests on such a sample yield an apparent Young's modulus that is greater in magnitude than the intrinsic Young's modulus of the rubber. As a result, many analytical equations have been developed to determine the intrinsic Young's modulus from the apparent Young's modulus of bonded rubber materials. In this work, the applicability of some of these analytical equations is assessed through experimental testing. The assessment is based on tests of vulcanized nitrile butadiene rubber (NBR70) samples using tensile and compression test methods. The analytical equations are used to determine the intrinsic Young's modulus from the apparent modulus derived from the compression test data of the bonded rubber samples. These Young's moduli are then compared with the actual Young's modulus derived from the tensile test data. The results show a significant discrepancy between the Young's modulus derived using the analytical equations and the actual Young's modulus.
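
One widely used analytical relation of the kind assessed here links the apparent compression modulus of a bonded rubber block to its intrinsic Young's modulus through the shape factor; the classical form below is a generic sketch and not necessarily one of the specific equations evaluated in the paper:

```latex
% Classical apparent-modulus relation for a bonded rubber block (generic sketch)
E_a = E_0\left(1 + 2kS^2\right), \qquad
S = \frac{\text{loaded area}}{\text{force-free bulge area}} = \frac{D}{4t}\ \ \text{(circular disc of diameter } D\text{, thickness } t\text{)}
```

where E_a is the apparent Young's modulus from the compression test, E_0 the intrinsic Young's modulus and k an empirically determined material constant.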

Keywords: Bonded rubber, quasi-static test, shape factor, apparent Young’s modulus.

8144 Combined Analysis of Sudoku Square Designs with Same Treatments

Authors: A. Danbaba

Abstract:

Experiments are often conducted in different environments, such as locations or periods (seasons), with identical treatments in each experiment, specifically to study the interaction between treatments and environments or between treatments and periods (seasons). The designs commonly used for this purpose are the randomized block design, the Latin square design, the balanced incomplete block design, the Youden design, and one-or-more-factor designs. The interest is in carrying out a combined analysis of the data from these multi-environment experiments instead of analyzing each experiment separately. This paper proposes a combined analysis of experiments conducted via Sudoku square designs of odd order with the same experimental treatments.
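
A combined analysis over environments is typically based on a linear model of the following generic kind; the sketch is illustrative and does not reproduce the paper's specific Sudoku-design model:

```latex
% Generic combined-analysis model across environments (illustrative)
y_{ijk} = \mu + E_i + B_{j(i)} + T_k + (ET)_{ik} + \varepsilon_{ijk}
```

where E_i is the environment (location or season) effect, B_{j(i)} the block effects nested within environments, T_k the treatment effect and (ET)_{ik} the treatment-by-environment interaction of primary interest.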

Keywords: Sudoku designs, combined analysis, multi-environment experiments, common treatments.

8143 Average Current Estimation Technique for Reliability Analysis of Multiple Semiconductor Interconnects

Authors: Ki-Young Kim, Jae-Ho Lim, Deok-Min Kim, Seok-Yoon Kim

Abstract:

Average current analysis, which checks the impact of current flow, is very important for guaranteeing the reliability of semiconductor systems. As semiconductor process technologies improve, the coupling capacitances often become larger than the self-capacitances. In this paper, we propose an analytic technique for analyzing the average current on interconnects in multi-conductor structures. The proposed technique has been shown to yield acceptable errors compared to HSPICE results while providing computational efficiency.

Keywords: Current moment, interconnect modeling, reliability analysis, worst-case switching.

8142 Analysis of Lead Time Delays in Supply Chain: A Case Study

Authors: Abdel-Aziz M. Mohamed, Nermeen Coutry

Abstract:

Lead time is a critical measure of a supply chain's performance. It impacts both customer satisfaction and the total cost of inventory. This paper presents the results of a study analyzing the customer order lead time for a multinational company. In the study, the lead time was divided into three stages: order entry, order fulfillment, and order delivery. A sample of 2,425 order lines was extracted from the company's records for this study. The sample data contain information on customer orders from the time of order entry until order delivery, as well as the lead time of each stage for different orders. Summary statistics on the lead time data reveal that about 30% of the orders were delivered later than the scheduled due date. Multiple linear regression analysis revealed that component type, logistics parameter, order size and customer type have significant impacts on lead time. Analysis of the lead time stages indicates that stage 2 consumed over 50% of the lead time. A Pareto analysis was performed to study the reasons for customer order delay in each stage, and recommendations were given to resolve the problem.
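
A minimal sketch of the kind of multiple linear regression described above is given below; the file name and column names (component_type, logistics_param, order_size, customer_type, lead_time) are hypothetical placeholders, since the paper's variables are described but not published:

```python
# Hedged sketch: multiple linear regression of lead time on order attributes.
# The CSV file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

orders = pd.read_csv("order_lines.csv")  # hypothetical extract of the 2,425 order lines

# Categorical predictors are wrapped in C(); order_size is treated as numeric.
model = smf.ols(
    "lead_time ~ C(component_type) + C(logistics_param) + order_size + C(customer_type)",
    data=orders,
).fit()

print(model.summary())        # coefficient estimates and p-values
print(model.pvalues < 0.05)   # predictors significant at the 5% level
```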

Keywords: Lead time reduction, customer satisfaction, service quality, statistical analysis.

8141 Brand Placement Strategies in Turkey: The Case of “Yalan Dünya”

Authors: Burçe Boyraz

Abstract:

This study examines brand placement as an alternative communication strategy in television series by focusing on Yalan Dünya, one of the most popular television series in Turkey. The study has a descriptive research design, and quantitative content analysis is used to analyze the frequency and duration of brand placement appearances in the first 3 seasons of Yalan Dünya, comprising 16 episodes. Brand placement practices in Yalan Dünya are examined in three categories: episode-based analysis, season-based analysis and comparative analysis. Finally, brand placement practices in Yalan Dünya are evaluated in terms of type, form, duration and legal arrangements. The study shows that brand placement plays a determinant role in the content of Yalan Dünya. It also finds that current legal arrangements bring brand placement closer to traditional communication strategies instead of clearly distinguishing it from them.

Keywords: Advertising, Alternative communication strategy, Brand placement, Yalan Dünya.

8140 Intellectual Capital and Competitive Advantage: An Analysis of the Biotechnology Industry

Authors: Campisi Domenico, Costa Roberta

Abstract:

Intellectual capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in allowing these assets to be managed effectively as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of intellectual capital. Following this lead, we propose an efficiency and productivity analysis of intellectual capital as a determinant of company competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify best-practice companies that have achieved competitive advantage by implementing successful intellectual capital management strategies, and they offer inefficient companies development paths through benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
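
To make the DEA step concrete, the sketch below computes input-oriented CCR efficiency scores with a generic linear-programming solver; the input/output data are made-up placeholders, not the paper's biotechnology dataset, and the MPI decomposition is omitted:

```python
# Hedged sketch: input-oriented CCR DEA efficiency scores via linear programming.
# The data below are illustrative placeholders, not the paper's biotechnology sample.
import numpy as np
from scipy.optimize import linprog

# rows = inputs / outputs, columns = decision-making units (companies)
X = np.array([[20.0, 30.0, 40.0, 25.0],      # e.g. intellectual-capital input 1
              [15.0, 10.0, 25.0, 20.0]])     # e.g. intellectual-capital input 2
Y = np.array([[100.0, 90.0, 160.0, 120.0]])  # e.g. value-added output

def ccr_efficiency(X, Y, o):
    """Efficiency of unit o: min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # minimise theta; variables are [theta, lam]
    A_in = np.hstack([-X[:, [o]], X])          # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y @ lam <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun

scores = [ccr_efficiency(X, Y, o) for o in range(X.shape[1])]
print(scores)  # a score of 1.0 marks a best-practice (efficient) company
```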

Keywords: Data Envelopment Analysis, Innovation, Intangible assets, Intellectual Capital, Malmquist Productivity Index.

8139 Enhanced Clustering Analysis and Visualization Using Kohonen's Self-Organizing Feature Map Networks

Authors: Kasthurirangan Gopalakrishnan, Siddhartha Khaitan, Anshu Manik

Abstract:

Cluster analysis is the name given to a diverse collection of techniques that can be used to classify objects (e.g., individuals, quadrats, species). While Kohonen's Self-Organizing Feature Map (SOFM) or Self-Organizing Map (SOM) networks have been successfully applied as classification tools to various problem domains, including speech recognition, image data compression, image and character recognition, robot control and medical diagnosis, their potential as a robust substitute for cluster analysis remains relatively unexplored. SOM networks combine competitive learning with dimensionality reduction by smoothing the clusters with respect to an a priori grid, and they provide a powerful tool for data visualization. In this paper, SOM is used to create a toroidal mapping on a two-dimensional lattice in order to perform cluster analysis on the results of a chemical analysis of wines produced in the same region in Italy but derived from three different cultivars, the "wine recognition data" available in the University of California-Irvine database. The results are encouraging, and it is believed that SOM would make an appealing and powerful decision-support tool for clustering tasks and for data visualization.
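
The sketch below reproduces the spirit of this experiment with the third-party MiniSom library and the UCI wine data as shipped with scikit-learn; it uses a plain rectangular lattice, so the toroidal mapping described above is not reproduced, and it is not the authors' own implementation:

```python
# Hedged sketch: SOM clustering of the UCI wine recognition data with MiniSom
# (a third-party library, not the authors' own SOM implementation).
import numpy as np
from minisom import MiniSom
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler

wine = load_wine()
data = StandardScaler().fit_transform(wine.data)   # 178 samples, 13 chemical features

som = MiniSom(10, 10, data.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
som.random_weights_init(data)
som.train_random(data, 5000)                       # unsupervised competitive learning

# Map each wine sample to its best-matching unit on the 10x10 lattice.
bmus = np.array([som.winner(x) for x in data])
for cultivar in np.unique(wine.target):
    cells = np.unique(bmus[wine.target == cultivar], axis=0)
    print(f"cultivar {cultivar}: occupies {len(cells)} map cells")
```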

Keywords: Artificial neural networks, cluster analysis, Kohonen maps, wine recognition.

8138 Development of Logic Model for R&D Program Plan Analysis in Preliminary Feasibility Study

Authors: Hyun-Kyu Kang

Abstract:

The Korean Government applies the preliminary feasibility study to new government R&D program plans as part of an evaluation system for R&D programs. The preliminary feasibility study for an R&D program is composed of three major criteria: technological, policy and economic analysis. The program logic model approach is used as part of the technological analysis in the preliminary feasibility study. We have developed and improved the R&D program logic model. The logic model is a very useful tool for evaluating R&D program plans. Using a logic model, we can identify the important factors of an R&D program plan, analyze its logic flow and find disconnections or jumps in the logic flow among the components of the model.

Keywords: Preliminary feasibility study, R&D program logic model, technological analysis.

8137 An Efficient Tool for Mitigating Voltage Unbalance with Reactive Power Control of Distributed Grid-Connected Photovoltaic Systems

Authors: Malinwo Estone Ayikpa

Abstract:

With the rapid increase in grid-connected PV systems over the last decades, genuine challenges have arisen for engineers and professionals in the energy field in planning and operating existing distribution networks with the integration of new generation sources. The conventional distribution network, however, was not designed to receive generation other than the main power supply. The tools generally used to analyze such networks become inefficient and cannot take into account all the constraints related to the operation of grid-connected PV systems. Some of these constraints are the difficulty of voltage control, reverse power flow, and especially voltage unbalance, which can be caused by the poor distribution of single-phase PV systems in the network. In order to analyze the impact of connecting small and large numbers of PV systems to distribution networks, this paper presents an efficient optimization tool that minimizes voltage unbalance in three-phase distribution networks with active and reactive power injections from the allocation of single-phase and three-phase PV plants. Reactive power can be generated or absorbed using the available capacity and the adjustable power factor of the inverter, and a good reduction of voltage unbalance can be achieved by reactive power control of the PV systems. The presented tool is based on the three-phase current injection method, and the PV systems are modeled via an equivalent circuit. The primal-dual interior point method is used to obtain the optimal operating points for the systems.
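
For reference, voltage unbalance in a three-phase network is commonly quantified through the symmetrical-component voltage unbalance factor; the generic definition below is illustrative and not necessarily the exact objective function minimised by the proposed tool:

```latex
% Voltage unbalance factor via symmetrical components (generic definition)
\mathrm{VUF} = \frac{|V^-|}{|V^+|}\times 100\%, \qquad
V^+ = \tfrac{1}{3}\left(V_a + aV_b + a^2V_c\right), \quad
V^- = \tfrac{1}{3}\left(V_a + a^2V_b + aV_c\right), \quad
a = e^{j2\pi/3}
```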

Keywords: Photovoltaic generation, primal-dual interior point method, three-phase optimal power flow, unbalanced system.

8136 Evaluation on Recent Committed Crypt Analysis Hash Function

Authors: A. Arul Lawrence Selvakumar, C. Suresh Ganandhas

Abstract:

This paper describes a study of cryptographic hash functions, one of the most important classes of primitives used in recent cryptographic techniques. We present different approaches to defining security properties more formally and present basic attacks on hash functions, and we recall the Merkle-Damgård security properties of iterated hash functions. The main aim of this paper is the development of recent cryptanalytic techniques applicable to hash functions, mainly from the SHA family. Recently proposed attacks on MD5 and SHA motivate a new hash function design, which is intended not only to have higher security but also to be faster than SHA-256. The performance of the new hash function is at least 30% better than that of SHA-256 in software, and it is secure against all known cryptographic attacks on hash functions.
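
To illustrate the Merkle-Damgård iteration recalled above, the toy sketch below implements the iterated construction with a deliberately weak compression function; it is purely pedagogical and is neither a secure hash nor the new design proposed in the paper:

```python
# Toy Merkle-Damgard construction with a deliberately weak compression function.
# Purely illustrative of the iterated structure; NOT secure and NOT the paper's design.
import struct

BLOCK = 8                       # block size in bytes (tiny, for illustration)
IV = 0x0123456789ABCDEF         # initial chaining value

def compress(state: int, block: bytes) -> int:
    """Weak 64-bit compression function mixing a message block into the chaining value."""
    m = int.from_bytes(block, "big")
    state = ((state ^ m) * 0x100000001B3) % (1 << 64)      # FNV-style multiply
    return ((state << 13) | (state >> 51)) % (1 << 64)     # rotate left by 13

def md_hash(message: bytes) -> int:
    # Merkle-Damgard strengthening: append 0x80, zero padding, then the message bit length.
    padded = message + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % BLOCK)
    padded += struct.pack(">Q", 8 * len(message))
    state = IV
    for i in range(0, len(padded), BLOCK):                 # iterate the compression function
        state = compress(state, padded[i:i + BLOCK])
    return state

print(hex(md_hash(b"abc")))
```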

Keywords: Cryptanalysis, cryptographic hash functions.

8135 Regional Analysis of Streamflow Drought: A Case Study for Southwestern Iran

Authors: M. Byzedi, B. Saghafian

Abstract:

Droughts are complex natural hazards that, to a varying degree, affect some parts of the world every year. The range of drought impacts is related to drought occurring in different stages of the hydrological cycle, and usually different types of droughts, such as meteorological, agricultural, hydrological, and socioeconomic, are distinguished. Streamflow drought was analyzed by the truncation level method (at the 70% level) on daily discharges measured at 54 hydrometric stations in southwestern Iran. Frequency analysis was carried out for the annual maximum series (AMS) of drought deficit volume and duration. Physiographic, climatic, geologic, and vegetation cover factors were studied as influential factors in the regional analysis. According to the results of factor analysis, the six most effective factors were identified as the watershed area, rainfall from December to February, the percentage of area with Normalized Difference Vegetation Index (NDVI) < 0.1, the percentage of convex area, the drainage density and the minimum watershed elevation, which together explained 90.9% of the variance. Homogeneous regions were determined by cluster analysis and discriminant function analysis. Suitable multivariate regression models were evaluated for streamflow drought deficit volume with a 2-year return period; the significance level of the regression models was 0.01. The results showed that the watershed area is the most effective factor, with a high correlation with deficit volume, and that drought duration was not a suitable drought index for regional analysis.
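
A minimal sketch of the truncation-level computation on a daily discharge series is given below; the series is synthetic, and the 70% level is taken here as the flow exceeded 70% of the time (the 30th percentile), which is an assumption about the convention used:

```python
# Hedged sketch: streamflow drought deficit volume and duration by the truncation-level method.
# The daily discharge series is synthetic, not one of the 54 gauged series used in the study.
import numpy as np

rng = np.random.default_rng(0)
q = rng.gamma(shape=2.0, scale=5.0, size=3 * 365)   # synthetic daily discharge (m3/s)

threshold = np.percentile(q, 30)                    # Q70: flow exceeded 70% of the time (assumed)
deficit = np.maximum(threshold - q, 0.0)            # below-threshold deficit (m3/s)

# Split the record into contiguous drought events and accumulate volume and duration.
events, volume, duration = [], 0.0, 0
for d in deficit:
    if d > 0:
        volume += d * 86400.0                       # m3/s over one day -> m3
        duration += 1
    elif duration:
        events.append((volume, duration))
        volume, duration = 0.0, 0
if duration:
    events.append((volume, duration))

largest = max(events, key=lambda e: e[0])           # the AMS would take the maximum per year
print(len(events), largest)
```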

Keywords: Iran, Streamflow drought, truncation level method, regional analysis.

8134 Power System Contingency Analysis Using Multiagent Systems

Authors: Anant Oonsivilai, Kenedy A. Greyson

Abstract:

Modern power systems demand fast energy management systems (EMS), and contingency analysis is among the most time-consuming EMS functions. In order to handle this limitation, this paper introduces agent-based technology into contingency analysis; the main function of the agents is to speed up the performance. The negotiation process in decision making is explained, and the issue addressed is the minimization of operating costs. The IEEE 14-bus system and its line outages have been used in the research, and simulation results are presented.

Keywords: Agents, model, negotiation, optimal dispatch, power systems.

8133 The Development and Testing of a Small Scale Dry Electrostatic Precipitator for the Removal of Particulate Matter

Authors: Derek Wardle, Tarik Al-Shemmeri, Neil Packer

Abstract:

This paper presents a small tube/wire type electrostatic precipitator (ESP). In the ESP's present form, the particle charging and collecting voltages and the airflow rates were individually varied throughout 200 ambient-temperature test runs, ranging from 10 to 30 kV in increments of 5 kV and from 0.5 m/s to 1.5 m/s, respectively. It was repeatedly observed that, at input air velocities of between 0.5 and 0.9 m/s and voltage settings of 20 kV to 30 kV, the collection efficiency remained above 95%. The outcomes of preliminary tests at combustion flue temperatures are, at present, inconclusive, although indications are that there is little or no drop in comparable performance under ideal test conditions. A limited set of similar tests was carried out in which the collecting electrode was grounded, having been disconnected from the static generator; the collection efficiency fell significantly, and for that reason this approach was not pursued further. The collection efficiencies during ambient-temperature tests were determined by mass balance between incoming and outgoing dry PM. The efficiencies of combustion-temperature runs were determined by analysing the difference in opacity of the flue gas at inlet and outlet compared to a reference light source. In addition, an array of Leit tabs (carbon-coated, electrically conductive adhesive discs) was placed at inlet and outlet for a number of four-day continuous ambient-temperature runs. Analysis of the discs' contamination was carried out using scanning electron microscopy and the ImageJ software, which confirmed collection efficiencies of over 99% and gave unequivocal support to all the previous tests; the average efficiency for these runs was 99.409%. Emissions collected from a woody biomass combustion unit, classified to a diameter of 100 µm, were used in all ambient-temperature test runs apart from two, which collected airborne dust from within the laboratory. Sawdust and wood pellets were chosen for the laboratory and field combustion trials. Video recordings were made of three ambient-temperature test runs in which the smoke from a wood smoke generator was drawn through the precipitator. Although these runs were visual indicators only, with no objective other than display, they provided a strong argument for the device's claimed efficiency, as no emissions were visible at the exit when the ESP was energised. The theoretical performance of ESPs, when applied to the geometry and configuration of the tested model, was compared to the actual performance and shown to be in good agreement with it.
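
The theoretical ESP performance mentioned in the closing sentence is classically estimated with the Deutsch-Anderson relation, while the measured efficiency comes from the mass balance; the forms below are generic sketches and not necessarily the exact expressions used by the authors:

```latex
% Classical theoretical estimate and measured efficiency for an ESP (generic sketches)
\eta_{theory} = 1 - \exp\!\left(-\frac{w\,A}{Q}\right), \qquad
\eta_{measured} = \frac{m_{in} - m_{out}}{m_{in}}
```

where w is the particle migration velocity, A the collecting electrode area, Q the volumetric gas flow rate, and m_in and m_out the dry particulate masses entering and leaving the precipitator.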

Keywords: Electrostatic precipitators, air quality, particulates emissions, electron microscopy, ImageJ.

8132 An Advanced Time-Frequency Domain Method for PD Extraction with Non-Intrusive Measurement

Authors: Guomin Luo, Daming Zhang, Yong Kwee Koh, Kim Teck Ng, Helmi Kurniawan, Weng Hoe Leong

Abstract:

Partial discharge (PD) detection is an important method for evaluating the insulation condition of metal-clad apparatus. Non-intrusive sensors, which are easy to install and do not interrupt operation, are preferred in on-site PD detection; however, such detection often lacks accuracy due to interference in the PD signals. In this paper, a novel PD extraction method that uses frequency analysis and entropy-based time-frequency (TF) analysis is introduced. The repetitive pulses from the converter are first removed via frequency analysis. Then, the relative entropy and relative peak frequency of each pulse (i.e., its time-indexed TF spectrum) are calculated, and all pulses with similar parameters are grouped. According to the characteristics of the non-intrusive sensor and the frequency distribution of PDs, the pulses of PD and interference are separated. Finally, the PD signal and the interference are recovered via inverse TF transform. The de-noised result on noisy PD data demonstrates that the combination of frequency and time-frequency techniques can discriminate PDs from interference with various frequency distributions.
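
The sketch below illustrates the grouping idea on synthetic pulses, using the relative entropy (Kullback-Leibler divergence) between normalised magnitude spectra together with the peak frequency; it is a simplified stand-in for the authors' entropy-based TF procedure, not their exact algorithm:

```python
# Hedged sketch: group pulses by relative entropy between normalised spectra and by
# peak frequency (synthetic pulses; simplified stand-in for the paper's TF method).
import numpy as np
from scipy.stats import entropy

fs = 10e6
t = np.arange(256) / fs
rng = np.random.default_rng(1)

def pulse(f0):
    """Damped oscillatory pulse centred at f0, with a little noise."""
    return np.exp(-2e5 * t) * np.sin(2 * np.pi * f0 * t) + 0.01 * rng.standard_normal(t.size)

pulses = [pulse(1.2e6) for _ in range(5)] + [pulse(3.5e6) for _ in range(5)]

def spectrum(x):
    s = np.abs(np.fft.rfft(x)) + 1e-12
    return s / s.sum()                          # normalise so it behaves like a distribution

ref = spectrum(pulses[0])
rel_entropy = np.array([entropy(spectrum(p), ref) for p in pulses])   # KL divergence to reference
peak_freq = np.array([np.argmax(spectrum(p)) for p in pulses]) * fs / t.size

# Pulses with small relative entropy and a similar peak frequency to the reference are
# grouped together; the other group is treated as a different source (PD or interference).
same_group = rel_entropy < 0.5                  # illustrative threshold
print(rel_entropy.round(3), same_group)
```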

Keywords: Entropy, Fourier analysis, non-intrusive measurement, time-frequency analysis, partial discharge

8131 Identifying Relationships between Technology-based Services and ICTs: A Patent Analysis Approach

Authors: Chulhyun Kim, Seungkyum Kim, Moon-soo Kim

Abstract:

A variety of new technology-based services have emerged with the development of Information and Communication Technologies (ICTs). Since technology-based services have technology-driven characteristics, identifying the relationships between technology-based services and ICTs yields meaningful implications. Thus, this paper proposes an approach for identifying these relationships by analyzing patent documents. First, business model (BM) patents are classified into relevant service categories. Second, patent citation analysis is conducted to investigate the technological linkage and impacts between technology-based services and ICTs at the macro level. Third, as a micro-level analysis, patent co-classification analysis is employed to identify the technological linkage and coverage. The proposed approach could guide and help managers and designers of technology-based services to discover opportunities for developing new technology-based services in emerging service sectors.

Keywords: Technology-based Services, Information and Communication Technology (ICT), Business Model (BM) Patent, Patent Analysis, Technological Relationship

8130 Time Series Forecasting Using Independent Component Analysis

Authors: Theodor D. Popescu

Abstract:

The paper presents a method for multivariate time series forecasting using Independent Component Analysis (ICA) as a preprocessing tool. The idea of this approach is to perform the forecasting in the space of independent components (sources) and then to transform the results back to the original time series space. The forecasting can be done separately, and with a different method, for each component, depending on its time structure. The paper also gives a review of the main algorithms for independent component analysis in the case of instantaneous mixture models, using second- and higher-order statistics. The method has been applied in simulation to an artificial multivariate time series with five components, generated from three sources and a randomly generated mixing matrix.
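
A minimal sketch of this forecast-in-source-space scheme is shown below with scikit-learn's FastICA and a deliberately simple per-component forecaster (linear-trend extrapolation); the five-channel series is synthetic, not the paper's data, and the per-component models used in the paper may differ:

```python
# Hedged sketch: forecast in the independent-component space, then mix back.
# Synthetic 5-channel series from 3 sources; the per-component forecaster is a
# simple linear-trend extrapolation chosen for brevity.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 500
t = np.arange(n)
sources = np.c_[np.sin(2 * np.pi * t / 50),
                np.sign(np.sin(2 * np.pi * t / 73)),
                0.01 * t + 0.1 * rng.standard_normal(n)]
mixing = rng.standard_normal((5, 3))
X = sources @ mixing.T                       # observed multivariate time series (500 x 5)

ica = FastICA(n_components=3, random_state=0)
S = ica.fit_transform(X)                     # estimated independent components

horizon = 20
S_future = np.empty((horizon, S.shape[1]))
for k in range(S.shape[1]):                  # forecast each component separately
    coeffs = np.polyfit(t[-100:], S[-100:, k], deg=1)
    S_future[:, k] = np.polyval(coeffs, np.arange(n, n + horizon))

X_future = ica.inverse_transform(S_future)   # back to the original time series space
print(X_future.shape)                        # (20, 5)
```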

Keywords: Independent Component Analysis, second order statistics, simulation, time series forecasting

8129 Role of Association Rule Mining in Numerical Data Analysis

Authors: Sudhir Jagtap, Kodge B. G., Shinde G. N., Devshette P. M

Abstract:

Numerical analysis naturally finds applications in all fields of engineering and the physical sciences, but in the 21st century the life sciences and even the arts have adopted elements of scientific computation. Numerical data analysis has become a key process in research and development across all fields [6]. In this paper, we attempt to analyze specified numerical patterns using association rule mining techniques with minimum-confidence and minimum-support mining criteria. The extracted rules and analysis results are demonstrated graphically. Association rules are a simple but very useful form of data mining that describe the probabilistic co-occurrence of certain events within a database [7]. They were originally designed to analyze market-basket data, in which the likelihood of items being purchased together within the same transaction is analyzed.
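
The sketch below shows the minimum-support / minimum-confidence workflow on discretised numerical data, using the third-party mlxtend library; the columns, bins and thresholds are illustrative placeholders rather than the patterns analysed in the paper:

```python
# Hedged sketch: association rules on discretised numerical data (mlxtend library).
# Column names, bins and thresholds are illustrative placeholders.
import numpy as np
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

rng = np.random.default_rng(0)
raw = pd.DataFrame({
    "temperature": rng.normal(25, 5, 200),
    "pressure": rng.normal(1.0, 0.1, 200),
    "vibration": rng.normal(0.5, 0.2, 200),
})

# Discretise each numerical column into low/medium/high bins and one-hot encode,
# so the data take the transactional (boolean) form association rule mining expects.
binned = raw.apply(lambda col: pd.cut(col, bins=3, labels=["low", "medium", "high"]))
onehot = pd.get_dummies(binned).astype(bool)

frequent = apriori(onehot, min_support=0.1, use_colnames=True)               # minimum support
rules = association_rules(frequent, metric="confidence", min_threshold=0.6)  # minimum confidence
print(rules[["antecedents", "consequents", "support", "confidence"]].head())
```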

Keywords: Numerical data analysis, Data Mining, Association Rule Mining

8128 Embedded Systems Energy Consumption Analysis Through Co-modelling and Simulation

Authors: José Antonio Esparza Isasa, Finn Overgaard Hansen, Peter Gorm Larsen

Abstract:

This paper presents a new methodology for studying power and energy consumption in mechatronic systems early in the development process. The approach uses two modeling languages to represent and simulate embedded control software and electromechanical subsystems in the discrete-event and continuous-time domains, respectively, within a single co-model. This co-model enables an accurate representation of power and energy consumption and facilitates the analysis and development of both software and electromechanical subsystems in parallel. It makes engineers aware of the energy implications of different design alternatives and enables trade-off analysis from the beginning of the analysis and design activities.

Keywords: Energy consumption, embedded systems, model-driven engineering, power awareness.

8127 Wavelet-Based ECG Signal Analysis and Classification

Authors: Madina Hamiane, May Hashim Ali

Abstract:

This paper presents the processing and analysis of ECG signals. The study is based on the wavelet transform and uses the MATLAB environment exclusively. It includes removing baseline wander and further de-noising through the wavelet transform; metrics such as the signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR) and mean squared error (MSE) are used to assess the efficiency of the de-noising techniques. Feature extraction is subsequently performed, whereby signal features such as heart rate and rise and fall levels are extracted, and the QRS complex is detected, which helps in classifying the ECG signal. Classification is the last step in the analysis, and it is shown that the ECG signals are successfully classified as normal rhythm or abnormal rhythm. The final results prove the adequacy of using the wavelet transform for the analysis of ECG signals.
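
Although the study itself is implemented in MATLAB, the wavelet de-noising and SNR evaluation it describes can be sketched as follows (Python with PyWavelets, a synthetic ECG-like signal, and an assumed universal soft threshold):

```python
# Hedged sketch: wavelet de-noising of an ECG-like signal plus an SNR check.
# The signal is synthetic and the universal soft threshold is an assumed choice.
import numpy as np
import pywt

fs = 360
t = np.arange(0, 10, 1 / fs)
clean = np.exp(-((t % 0.83) - 0.4) ** 2 / 0.002)                  # crude periodic "beats"
noisy = clean + 0.2 * np.random.default_rng(0).standard_normal(t.size)

# Multilevel decomposition, soft-threshold the detail coefficients, reconstruct.
coeffs = pywt.wavedec(noisy, "db4", level=6)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745                    # noise estimate (finest level)
thr = sigma * np.sqrt(2 * np.log(noisy.size))                     # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

def snr_db(reference, signal):
    noise = reference - signal
    return 10 * np.log10(np.sum(reference ** 2) / np.sum(noise ** 2))

print(f"SNR before: {snr_db(clean, noisy):.1f} dB, after: {snr_db(clean, denoised):.1f} dB")
```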

Keywords: ECG Signal, QRS detection, thresholding, wavelet decomposition, feature extraction.

8126 A Failure Analysis Tool for HDD Analysis

Authors: C. Kumjeera, T. Unchim, B. Marungsri, A. Oonsivilai

Abstract:

Studies of piezoelectric material have in the past been carried out in the T-domain; however, the material has not been studied in the S-domain form. This paper presents piezoelectric material in a transfer function, or S-domain, model. The S-domain is a well known mathematical model used for analyzing the stability of a material and determining its stability limits. Using the S-domain to test the stability of piezoelectric material provides a new tool for the scientific community to study this material in various forms.

Keywords: Hard disk drive, failure analysis, tool, time

8125 The Analysis on Leadership Skills in UK Automobile Manufacturing Enterprises

Authors: Yanting Cao, Kazumitsu Nawata

Abstract:

The UK has strong economic growth, which attracts other countries to invest there through globalization. This research will be based on quantitative and qualitative descriptive analysis using interviews. The secondary analysis will involve a case study approach to understand the important aspects of leadership skills. The research outcomes will identify the strengths and weaknesses of leadership skills in UK automobile manufacturing enterprises and suggest the best practices adopted by the respective companies for better results.

Keywords: engineering management, leadership, industrial project management, project managers, automobile manufacturing

8124 A Sociocybernetics Data Analysis Using Causality in Tourism Networks

Authors: M. Lloret-Climent, J. Nescolarde-Selva

Abstract:

The aim of this paper is to propose a mathematical model to determine invariant sets, set covering, orbits and, in particular, attractors in the set of tourism variables. The analysis was carried out based on a pre-designed algorithm and on our interpretation of chaos theory developed in the context of General Systems Theory. This article sets out the causal relationships associated with tourist flows in order to enable the formulation of appropriate strategies. Our results can be applied to numerous cases; for example, in the analysis of tourist flows, these findings can be used to determine whether the behaviour of certain groups affects that of other groups and to analyse tourist behaviour in terms of the most relevant variables. Unlike statistical analyses that merely provide information on current data, our method uses orbit analysis to forecast, if attractors are found, the behaviour of tourist variables in the immediate future.

Keywords: Attractor, invariant set, orbits, tourist variables.

8123 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a Decision Support System (DSS) to support operational decisions in Emergency Medical Service (EMS) systems regarding the assignment of medical emergency requests to Emergency Departments (ED). This problem is called "hospital selection" and concerns the definition of policies for selecting the ED to which patients who require further treatment are transported by ambulance. The research methodology consists of a first phase reviewing the technical-scientific literature on DSSs that support EMS management and, in particular, the hospital selection decision. The literature analysis showed that current studies mainly focus on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a mission; all ED-related issues are therefore excluded and treated as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes travel time and frees up the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that also considers the expected time performance in the subsequent phases of the process, such as the case mix, the expected service throughput times, and the operational capacity of the different EDs. To this end, a Discrete Event Simulation (DES) model was created to compare different hospital selection policies. The model was implemented with the AnyLogic software and validated on a realistic case. The hospital selection policy that returned the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the start of the clinical evaluation by the ED doctor. Finally, two approaches were compared: a static approach, based on a retrospective estimation of the TTP, and a dynamic approach, based on a predictive estimation of the TTP determined with a constantly updated Winters forecasting model. Findings reveal that minimizing the TTP is the best hospital selection policy: it significantly reduces service throughput times in the ED with a negligible increase in travel time. Furthermore, it provides an immediate view of the saturation state of the ED and takes into account the case mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Moreover, a predictive approach is more reliable for TTP estimation than a retrospective one. These considerations can support decision-makers in introducing hospital selection policies that enhance EMS performance.
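
The Winters-type prediction that drives the dynamic policy can be sketched as follows; the hourly TTP history, seasonal period and model settings are illustrative assumptions, not the study's calibrated model:

```python
# Hedged sketch: a Holt-Winters (Winters-type) forecast of Time To Provider (TTP).
# The hourly TTP history is synthetic and the model settings are assumed, not the study's.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
hours = pd.date_range("2023-01-01", periods=21 * 24, freq="H")
daily_cycle = 10 + 6 * np.sin(2 * np.pi * hours.hour / 24)     # busier ED in the evening
ttp = pd.Series(np.asarray(daily_cycle) + rng.normal(0, 1.5, hours.size), index=hours)

model = ExponentialSmoothing(ttp, trend="add", seasonal="add", seasonal_periods=24)
fit = model.fit()

# Predicted TTP over the next hour; repeating this per candidate ED, the ambulance
# would be routed to the ED with the lowest predicted TTP.
print(fit.forecast(1))
```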

Keywords: Emergency medical services, hospital selection, discrete event simulation, forecast model.

8122 Idiopathic Constipation can be Subdivided in Clinical Subtypes: Data Mining by Cluster Analysis on a Population based Study

Authors: Mauro Giacomini, Stefania Bertone, Carlo Mansi, Pietro Dulbecco, Vincenzo Savarino

Abstract:

The prevalence of non-organic constipation differs from country to country, and the reliability of the estimated rates is uncertain. Moreover, the clinical relevance of subdividing the heterogeneous functional constipation disorders into pre-defined subgroups is largely unknown. Aim: to estimate the prevalence of constipation in a population-based sample and determine whether clinical subgroups can be identified. An age- and gender-stratified sample population from 5 Italian cities was evaluated using a previously validated questionnaire, and data mining by cluster analysis was used to determine constipation subgroups. Results: 1,500 complete interviews were obtained from 2,083 contacted households (72%). Self-reported constipation correlated poorly with symptom-based constipation, which was found in 496 subjects (33.1%). Cluster analysis identified four constipation subgroups, which correlated with the subgroups identified according to pre-defined symptom criteria. Significant differences in socio-demographics and lifestyle were observed among the subgroups.

Keywords: Cluster analysis, constipation, data mining, statistical analysis.

8121 Automated Particle Picking based on Correlation Peak Shape Analysis and Iterative Classification

Authors: Hrabe Thomas, Beck Florian, Nickell Stephan

Abstract:

Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, the development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach integrates peak shape analysis into the classical correlation and uses an iterative approach to separate macromolecules and background by classification. This particle selection workflow furthermore provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed by processing simulated and experimental data.

Keywords: Cryo-electron Microscopy, Single Particle Analysis, Image Processing.

8120 An Investigation of Shipping Comb Failures due to usage in Manufacturing Processes using RCFA and FMEA

Authors: Atjanakul W, Chutima S., Kamnerdthong T.

Abstract:

The shipping comb is mounted on the Head Stack Assembly (HSA) to prevent collisions of the heads, maintain the gap between suspensions and protect the HSA tips from unintentional contact damage in the manufacturing process. A failure analysis of shipping combs in hard disk drive production processes is proposed. Field observations were performed to determine the critical areas on the shipping comb and their failure fractions. Root cause failure analysis (RCFA) is applied to identify the failure causes under various loading conditions. For reliability improvement, a failure mode and effects analysis (FMEA) procedure is performed to evaluate the risk priority. Consequently, more suitable design criteria were obtained.
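
To make the FMEA risk-priority step concrete, the sketch below computes risk priority numbers (RPN = severity x occurrence x detection); the failure modes and rankings are illustrative placeholders, not the study's actual FMEA worksheet:

```python
# Hedged sketch: FMEA risk priority numbers for shipping-comb failure modes.
# The failure modes and 1-10 rankings below are illustrative placeholders.
failure_modes = [
    # (failure mode, severity, occurrence, detection), each ranked 1-10
    ("comb tip deformation during HSA mounting", 7, 5, 4),
    ("comb-suspension gap out of specification", 8, 3, 5),
    ("comb breakage at ejection from the fixture", 6, 4, 3),
]

ranked = sorted(((name, s * o * d) for name, s, o, d in failure_modes),
                key=lambda item: item[1], reverse=True)

for name, rpn in ranked:        # the highest RPN gets the highest redesign priority
    print(f"RPN {rpn:4d}  {name}")
```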

Keywords: Shipping comb, hard disk drive, root cause failure analysis, failure mode and effects analysis.

8119 Harmonic Analysis of 240 V AC Power Supply using TMS320C6713 DSK

Authors: Dody Ismoyo, Mohammad Awan, Norashikin Yahya

Abstract:

The presence of harmonics in power systems has been a major concern for power engineers for many years. With the increasing use of nonlinear loads in power systems, harmonic pollution is becoming more serious. One of the widely used computation algorithms for harmonic analysis is the fast Fourier transform (FFT). In this paper, a harmonic analyzer using the FFT was implemented on the TMS320C6713 DSK. The supply voltage of 240 V at 59 Hz is stepped down to 5 V using a voltage divider in order to match the power rating of the DSK input. The output from the DSK was displayed on an oscilloscope and in the Code Composer Studio™ software. This work demonstrates the possibility of analyzing the harmonic content of a 240 V power supply using the DSK board.
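
The FFT-based computation described above can be sketched offline as follows (plain Python/NumPy rather than DSK code, with a synthetic distorted supply waveform and an assumed 50 Hz fundamental):

```python
# Hedged sketch of FFT-based harmonic analysis and THD, run offline in NumPy rather
# than on the TMS320C6713 DSK. The sampled waveform is synthetic; 50 Hz fundamental assumed.
import numpy as np

fs = 10_000                                   # sampling rate (Hz)
f0 = 50                                       # assumed mains fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)
v = 240 * np.sqrt(2) * (np.sin(2 * np.pi * f0 * t)
                        + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)    # 3rd harmonic
                        + 0.03 * np.sin(2 * np.pi * 5 * f0 * t))   # 5th harmonic

spectrum = 2 * np.abs(np.fft.rfft(v)) / v.size    # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(v.size, 1 / fs)

harmonics = {n: spectrum[np.argmin(np.abs(freqs - n * f0))] for n in range(1, 8)}
thd = np.sqrt(sum(m ** 2 for n, m in harmonics.items() if n > 1)) / harmonics[1]

print({n: round(m, 1) for n, m in harmonics.items()})
print(f"THD = {thd:.2%}")
```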

Keywords: Harmonic Analysis, DSP.

8118 Comparative Analysis of Vibration between Laminated Composite Plates with and without Holes under Compressive Loads

Authors: Bahi-Eddine Lahouel, Mohamed Guenfoud

Abstract:

In this study, a vibration analysis of symmetric angle-ply laminated composite plates with and without a square hole under compressive loads was carried out numerically. A buckling analysis is also performed to determine the buckling load of the laminated plates. For each fibre orientation, the compressive load is taken equal to 50% of the corresponding buckling load. In the analysis, the finite element method (FEM) is applied to perform parametric studies, and the effects of the degree of orthotropy and the stacking sequence on the fundamental frequencies and buckling loads are discussed. The results show that the presence of a constant compressive load tends to reduce the natural frequencies uniformly for materials with a low degree of orthotropy; however, this reduction becomes non-uniform for materials with a higher degree of orthotropy.
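
The uniform frequency reduction noted above follows the classical relation between natural frequency and in-plane compressive load for a plate whose fundamental vibration mode coincides with the buckling mode; the expression below is a generic sketch, not necessarily the exact relation evaluated in the FEM study:

```latex
% Natural frequency of a plate under an in-plane compressive load N (generic relation)
\omega_N = \omega_0 \sqrt{1 - \frac{N}{N_{cr}}}
```

where ω_0 is the unloaded fundamental frequency and N_cr the buckling load; at N = 0.5 N_cr, the loading considered here, this idealised relation predicts ω_N ≈ 0.707 ω_0.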

Keywords: Vibration, Buckling, Cutout, Laminated composite, FEM
