Search results for: Fault diagnosis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 711

321 Software Maintenance Severity Prediction with Soft Computing Approach

Authors: E. Ardil, Erdem Uçar, Parvinder S. Sandhu

Abstract:

Since the majority of faults are found in only a few modules of a software system, there is a need to identify the modules that are severely affected compared to the others, so that maintenance can be carried out on time, especially for critical applications. In this paper, we have applied different predictor models to a NASA public domain defect dataset coded in the Perl programming language. Machine learning algorithms belonging to different learner categories of the WEKA project, along with a Mamdani-based fuzzy inference system and a neuro-fuzzy system, have been evaluated for modeling the maintenance severity, i.e., the impact of fault severity. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). They show that the neuro-fuzzy model provides relatively better prediction accuracy than the other models and can therefore be used for software maintenance severity prediction.
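A minimal sketch of the evaluation measures named in the abstract (Accuracy, MAE, RMSE), computed here with plain NumPy rather than the WEKA learners actually used in the study; the severity labels are illustrative placeholders, not the NASA dataset.

import numpy as np

def evaluate(y_true, y_pred):
    """Return accuracy (on rounded class labels), MAE and RMSE."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    accuracy = np.mean(np.round(y_pred) == np.round(y_true))
    mae = np.mean(np.abs(y_pred - y_true))
    rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))
    return accuracy, mae, rmse

# Illustrative severity labels and predictions only.
acc, mae, rmse = evaluate([1, 2, 3, 2, 1], [1.1, 2.4, 2.8, 2.0, 1.3])
print(f"Accuracy={acc:.2f}  MAE={mae:.3f}  RMSE={rmse:.3f}")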

Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.

320 An Artificial Neural Network Model for Earthquake Prediction and Relations between Environmental Parameters and Earthquakes

Authors: S. Niksarlioglu, F. Kulahci

Abstract:

Earthquakes are natural phenomena that occur under the influence of many parameters, such as seismic activity, changes in groundwater motion, changes in water temperature, etc. On the other hand, radon gas concentrations in soil generally vary nonlinearly with earthquakes. Continuous measurement of soil radon gas is very important for characterizing seismic activity. The radon gas level changes continuously with the strain occurring within the Earth's crust during an earthquake and is affected by physical and chemical factors such as soil structure, soil permeability, soil temperature, barometric pressure, etc. Therefore, knowing the radon gas concentration alone is not sufficient for modeling studies. In this research, we determined relationships between radon emissions, the associated environmental parameters and earthquakes occurring along the East Anatolian Fault Zone (EAFZ), Türkiye, and predicted the magnitudes of some earthquakes with an artificial neural network (ANN) model.
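A minimal sketch of the kind of ANN regression described here, using scikit-learn's MLPRegressor on synthetic radon/environmental features; the feature names, data and network size are illustrative assumptions, not the authors' EAFZ dataset or architecture.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic features: radon concentration, soil temperature, barometric pressure.
X = rng.normal(size=(200, 3))
y = 3.0 + 0.8 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(scale=0.1, size=200)

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)            # train magnitude regression on environmental parameters
print("Predicted magnitude:", model.predict(scaler.transform(X[:1]))[0])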

Keywords: Earthquake, Modeling, Prediction, Radon.

319 A Preliminary X-Ray Study on Human-Hair Microstructures for a Health-State Indicator

Authors: Phannee Saengkaew, Weerasak Ussawawongaraya, Sasiphan Khaweerat, Supagorn Rugmai, Sirisart Ouajai, Jiraporn Luengviriya, Sakuntam Sanorpim, Manop Tirarattanasompot, Somboon Rhianphumikarakit

Abstract:

We present a preliminary X-ray study of human-hair microstructures as a health-state indicator, in particular for cancer cases. As an uncomplicated and low-cost X-ray technique, the human-hair microstructure was analyzed by wide-angle X-ray diffraction (XRD) and small-angle X-ray scattering (SAXS). The XRD measurements exhibited simple reflections at d-spacings of 28 Å, 9.4 Å and 4.4 Å, corresponding to the periodic distance of the protein matrix of the hair macrofibrils and to the diameter and repeat spacing of the polypeptide alpha helices of the protofibrils of the hair microfibrils, respectively. Compared to the normal cases, the unhealthy cases, including the breast- and ovarian-cancer cases, showed higher normalized ratios of the X-ray diffraction peaks at 9.4 Å and 4.4 Å. This likely resulted from altered distributions of microstructures caused by molecular alteration. As an elemental analysis by X-ray fluorescence (XRF), the normalized quantitative ratios of zinc (Zn)/calcium (Ca) and iron (Fe)/calcium (Ca) were determined. Analogously, both the Zn/Ca and Fe/Ca ratios of the unhealthy cases were higher than those of the normal cases. Combining the structural analysis by XRD with the elemental analysis by XRF showed that the modified fibrous microstructures of the hair samples were related to their altered elemental compositions. Therefore, these microstructural and elemental analyses of hair samples can support the diagnosis of cancer and genetic diseases, and such early diagnosis would lower the risk posed by these diseases. However, a high-intensity X-ray source, a high-resolution X-ray detector, and more hair samples are needed to develop this X-ray technique further, and its efficiency would be enhanced by including skin and fingernail samples alongside the hair analysis.

Keywords: Human-hair analysis, XRD, SAXS, breast cancer, health-state indicator

318 Comparative Analysis and Evaluation of Software Vulnerabilities Testing Techniques

Authors: Khalid Alnafjan, Tazar Hussain, Hanif Ullah, Zia ul haq Paracha

Abstract:

Software and applications are subject to serious and damaging security threats, and these threats are increasing as a result of a growing number of potential vulnerabilities. Security testing is an indispensable process to validate software security requirements and to identify security-related vulnerabilities. In this paper we analyze and compare different available vulnerability testing techniques against pre-defined criteria using the analytic hierarchy process (AHP). We have selected five testing techniques: source code analysis, fault code injection, robustness testing, stress testing and penetration testing. These techniques have been evaluated against five criteria: cost, thoroughness, ease of use, effectiveness and efficiency. The outcome of the study helps researchers, testers and developers to understand the effectiveness of each technique in its respective domain. The study also helps to compare the inner workings of the testing techniques against a selected criterion to achieve optimum testing results.
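A minimal sketch of the AHP machinery the abstract relies on: a pairwise comparison matrix over the five criteria is reduced to a priority vector via the principal eigenvector, with a consistency check. The comparison values below are illustrative assumptions, not the judgments used in the paper.

import numpy as np

criteria = ["cost", "thoroughness", "ease of use", "effectiveness", "efficiency"]

# Illustrative pairwise comparison matrix (Saaty 1-9 scale); A[i, j] = importance of i over j.
A = np.array([
    [1,   1/3, 2,   1/5, 1/2],
    [3,   1,   4,   1/2, 2  ],
    [1/2, 1/4, 1,   1/5, 1/3],
    [5,   2,   5,   1,   3  ],
    [2,   1/2, 3,   1/3, 1  ],
])

# Priority vector = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
idx = np.argmax(np.real(eigvals))
w = np.real(eigvecs[:, idx])
w = w / w.sum()

# Consistency ratio (random index RI = 1.12 for n = 5); CR < 0.1 is usually acceptable.
lam_max = np.real(eigvals[idx])
ci = (lam_max - len(A)) / (len(A) - 1)
print(dict(zip(criteria, w.round(3))), "CR =", round(ci / 1.12, 3))

The same eigenvector step is then repeated per criterion to score the five testing techniques, and the weighted scores are aggregated.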

Keywords: Software Security, Security Testing, Testing techniques, vulnerability, AHP.

317 Robust Coordinated Design of Multiple Power System Stabilizers Using Particle Swarm Optimization Technique

Authors: Sidhartha Panda, C. Ardil

Abstract:

Power system stabilizers (PSS) are now routinely used in industry to damp out power system oscillations. In this paper, the particle swarm optimization (PSO) technique is applied to the coordinated design of multiple power system stabilizers in a multi-machine power system. The design problem of the proposed controllers is formulated as an optimization problem, and PSO is employed to search for the optimal controller parameters. By minimizing a time-domain objective function based on the deviation in the oscillatory rotor speed of the generators, the stability performance of the system is improved. Non-linear simulation results are presented for severe and small disturbances at different locations, as well as for various fault clearing sequences, to show the effectiveness and robustness of the proposed controllers and their ability to provide efficient damping of low frequency oscillations.
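A minimal sketch of PSO minimizing a time-domain objective of the kind described (an integral of the squared rotor-speed deviation); the plant here is a toy damped oscillator standing in for the multi-machine model, and all gain ranges and PSO coefficients are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

def objective(gains):
    """Integral of squared speed deviation for a toy second-order swing model."""
    k1, k2 = gains
    dt, x, v, cost = 0.01, 0.1, 0.0, 0.0       # initial speed deviation of 0.1 pu
    for _ in range(1000):
        a = -(1.0 + k2) * x - (0.2 + k1) * v    # stabilizer adds damping and synchronizing torque
        v += a * dt
        x += v * dt
        cost += x * x * dt
    return cost

n, dim = 20, 2
pos = rng.uniform(0.0, 5.0, (n, dim))           # particle positions = candidate PSS gains
vel = np.zeros((n, dim))
pbest, pcost = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[pcost.argmin()].copy()

for _ in range(50):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 5.0)
    cost = np.array([objective(p) for p in pos])
    improved = cost < pcost
    pbest[improved], pcost[improved] = pos[improved], cost[improved]
    gbest = pbest[pcost.argmin()].copy()

print("Best gains:", gbest.round(3), "objective:", pcost.min().round(5))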

Keywords: Low frequency oscillations, Particle swarm optimization, power system stability, power system stabilizer, multimachine power system.

316 Secure Secret Recovery by using Weighted Personal Entropy

Authors: Leau Y. B., Dinna Nina M. N., Habeeb S. A. H., Jetol B.

Abstract:

Authentication plays a vital role in many secure systems. Most of these systems require the user to log in with a secret password or pass phrase before entering the system. This ensures that valuable information is kept confidential while also guaranteeing its integrity and availability. However, to achieve this goal, users are required to memorize high-entropy passwords or pass phrases, and they often find it difficult to remember such meaningless strings of data. This paper presents a new scheme which assigns a weight to each personal question posed to the user when revealing the encrypted secret or password. The focus of this scheme is to offer fault tolerance to users by allowing them to forget the answers to a subset of questions and still recover the secret and achieve successful authentication. A comparison of the level of security of weight-based and weightless secret recovery schemes is also discussed. The paper concludes with a few areas that require further investigation in this research.
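A minimal sketch of the weighted recovery idea as described: each correctly answered personal question contributes its weight, and the secret is released only if the accumulated weight reaches a threshold, so a subset of answers may be forgotten. Question texts, weights and the threshold are illustrative assumptions, and a real scheme would derive a key from the answers rather than compare hashes of individual answers as done here.

import hashlib

def h(ans: str) -> str:
    return hashlib.sha256(ans.strip().lower().encode()).hexdigest()

# (question, weight, hash of the expected answer) - illustrative values only.
QUESTIONS = [
    ("First school attended?", 3, h("hill park")),
    ("Favorite childhood book?", 2, h("treasure island")),
    ("City of first job?", 2, h("kuching")),
    ("Name of first pet?", 1, h("milo")),
]
THRESHOLD = 5   # total weight is 8, so up to 3 weight units of answers may be forgotten

def recovered_weight(answers):
    """Sum the weights of the correctly answered questions."""
    return sum(w for (_, w, expected), a in zip(QUESTIONS, answers) if h(a) == expected)

def can_recover(answers) -> bool:
    return recovered_weight(answers) >= THRESHOLD

# The user forgets the book and the pet's name but still reaches the threshold (3 + 2 = 5).
print(can_recover(["Hill Park", "??", "Kuching", "??"]))   # True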

Keywords: Secret Recovery, Personal Entropy, Cryptography, Secret Sharing and Key Management.

315 SWARM: A Meta-Scheduler to Minimize Job Queuing Times on Computational Grids

Authors: Jean-Alain Grunchec, Jules Hernández-Sánchez, Sara Knott

Abstract:

Some meta-schedulers query the information system of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts due to changes in scheduling priorities. The MSR scheme is based on Multiple Simultaneous Requests and can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing the job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Statistics are provided for this usage and demonstrate its capacity to reliably achieve a substantial reduction of the execution time in production conditions.

Keywords: Grid computing, multiple simultaneous requests, fault tolerance, GridQTL.

314 Thermal Securing of Electrical Contacts inside Oil Power Transformers

Authors: Ioan Rusu

Abstract:

Power transformers of 110 kV/MV in substations are traversed by fault currents resulting from MV line damage. Defective electrical contacts heat up when fault currents pass through them. When high temperatures are reached (135 °C), the electro-insulating oil in the vicinity of these faulty contacts releases gases and activates the electrical protection. To avoid auto-ignition of the electro-insulating oil, we designed a system for thermally securing defective electrical contacts by pouring fire-resistant polyurethane foam, mastic or fireproof mortar inside an electro-insulating cardboard cylinder. From practical experience in the operation of 110 kV/MV oil-insulated power transformers, some transient disconnections commanded by the gas protection have been recorded at internal defects. In normal operation and at optimal load, nominal currents do not require thermal securing of the contacts inside transformers; contacts are made at fabrication according to the design, or during repair by soldering. In the case of external short circuits close to the substation, contacts inside transformers, even when well made with a resistance of about Rcontact = 10⁻⁶ Ω, are subjected to short-circuit currents of the order of 10 kA-20 kA, which leads to the dissipation of significant electric power, of the order of 100 W-400 W, at the contact. Under internal or external factors acting on the contacts, including electrodynamic forces during short circuits, the contact resistance can degrade over time to values in the range of 10⁻⁵ Ω to 10⁻⁴ Ω, and if the protection acting time is long, of the order of seconds, the power dissipated at the contacts can reach high values, from 1.0 kW to 40.0 kW. This power leads to strong local heating of hundreds of degrees Celsius and can initiate self-ignition and burning of the oil in the vicinity of the contacts, with operation of the gas relay. Degradation of electrical contacts inside power transformers cannot be ruled out for the duration of their operation. In order to avoid oil burning with gas release near electrical contacts at short-circuit currents of 10 kA-20 kA, we outline the following solution: covering the electrical contacts in fireproof materials, which avoids direct burning of the oil at short circuit and allows the heat from the contact to be transmitted along the conductors and dissipated gradually over time in a large cooling volume. The flame-retardant materials are polyurethane foam, mastic and cement (concrete). In the normal operating condition of the transformer, the coil conductors are insulated with paper and insulating oil, whose approximate ignition points are 135 °C for the oil and 200 °C for the paper. In the case of a faulty electrical contact of about 10⁻³ Ω, the temperature at short circuit can reach, for a short time, 300 °C-400 °C, which ignites the paper and also the oil. The burning oil releases local gases that disconnect the power transformer. Thermally securing the electrical contacts inside the transformer, in a cardboard tube with polyurethane foam, mastic or cement, prevents gas release and spurious operation of the gas protection.
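A minimal sketch of the Joule-heating arithmetic behind the figures quoted above (P = R·I²), confirming the 100 W-400 W range for a sound 10⁻⁶ Ω contact and the 1 kW-40 kW range for a degraded one; the values are simply those stated in the abstract.

# Power dissipated at a contact of resistance R [ohm] carrying current I [A]: P = R * I**2.
def contact_power(resistance_ohm: float, current_a: float) -> float:
    return resistance_ohm * current_a ** 2

for r in (1e-6, 1e-5, 1e-4):                 # sound and degraded contact resistances
    for i in (10e3, 20e3):                   # short-circuit currents of 10 kA and 20 kA
        print(f"R = {r:.0e} ohm, I = {i/1e3:.0f} kA -> P = {contact_power(r, i):,.0f} W")
# 1e-6 ohm gives 100-400 W; 1e-5 to 1e-4 ohm gives 1,000-40,000 W (1-40 kW).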

Keywords: Power transformer, oil insulation, electric contacts, gases, gas relay.

313 A Fuzzy Swarm Optimized Approach for Piece Selection in Bit Torrent Like Peer to Peer Network

Authors: M. Padmavathi, R. M. Suresh

Abstract:

Every machine plays the roles of client and server simultaneously in a peer-to-peer (P2P) network. Although a P2P network has many advantages over traditional client-server models regarding efficiency and fault tolerance, it also faces additional security threats. Users and IT administrators should be aware of the risks from malicious code propagation, the legality of downloaded content, and vulnerabilities in P2P software. Security and preventative measures are a must to protect networks from potential sensitive information leakage and security breaches. Bit Torrent is a popular and scalable P2P file distribution mechanism which successfully distributes large files quickly and efficiently without overloading the origin server. Bit Torrent achieves excellent upload utilization according to measurement studies, but it has also raised many questions regarding utilization in settings other than those measured, fairness, and the choice of its mechanisms. This work proposes a block (piece) selection technique using fuzzy ACO, with the optimal rules selected using ACO.
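A minimal sketch of ant-colony-style piece selection of the kind the abstract points to: each missing piece is scored by pheromone and a heuristic (here, rarity among peers), and a piece is drawn with probability proportional to pheromone^alpha * heuristic^beta. The fuzzy rule layer of the paper is omitted, and all parameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)

def choose_piece(missing, availability, pheromone, alpha=1.0, beta=2.0):
    """Pick one missing piece with ACO-style probabilities (rarer pieces get a higher heuristic)."""
    heuristic = 1.0 / availability[missing]            # rarest-first style desirability
    scores = pheromone[missing] ** alpha * heuristic ** beta
    probs = scores / scores.sum()
    return missing[rng.choice(len(missing), p=probs)]

n_pieces = 8
availability = rng.integers(1, 6, size=n_pieces)       # how many peers hold each piece
pheromone = np.ones(n_pieces)
missing = np.arange(n_pieces)

for step in range(5):
    piece = choose_piece(missing, availability, pheromone)
    pheromone *= 0.9                                    # evaporation
    pheromone[piece] += 1.0 / (step + 1)                # reinforce choices that paid off
    missing = missing[missing != piece]
    print("downloaded piece", piece)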

Keywords: Ant Colony Optimization (ACO), Bit Torrent, Download time, Peer-to-Peer (P2P) network, Performance.

312 Replicating Data Objects in Large-scale Distributed Computing Systems using Extended Vickrey Auction

Authors: Samee Ullah Khan, Ishfaq Ahmad

Abstract:

This paper proposes a novel game-theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs the extended Vickrey auction. Specifically, players in a non-cooperative environment compete for scarce server-side memory space to replicate data objects so as to minimize the total network object transfer cost while maintaining object concurrency. Optimization of this cost in turn leads to load balancing, fault tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin-packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.
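A minimal sketch of the classical single-item Vickrey auction at the core of the technique: the highest bidder wins but pays the second-highest bid, which makes truthful bidding the dominant strategy. The extended, replica-allocation version used in the paper is not reproduced; bidder names and bids are illustrative assumptions.

def vickrey_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winner, price): highest bidder wins, pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

# Servers bid their valuation of hosting a replica (e.g., transfer-cost savings).
bids = {"server_a": 12.0, "server_b": 9.5, "server_c": 14.0}
winner, price = vickrey_auction(bids)
print(f"{winner} wins the replica slot and pays {price}")   # server_c pays 12.0

Because payment does not depend on the winner's own bid, servers have no incentive to misreport their valuations, which is what makes the auction attractive for non-cooperative replica placement.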

Keywords: Auctions, data replication, pricing, static allocation.

311 Real-Coded Genetic Algorithm for Robust Power System Stabilizer Design

Authors: Sidhartha Panda, C. Ardil

Abstract:

Power system stabilizers (PSS) are now routinely used in industry to damp out power system oscillations. In this paper, the real-coded genetic algorithm (RCGA) optimization technique is applied to design robust power system stabilizers for both a single-machine infinite-bus (SMIB) system and a multi-machine power system. The design problem of the proposed controller is formulated as an optimization problem, and the RCGA is employed to search for the optimal controller parameters. By minimizing a time-domain objective function based on the deviation in the oscillatory rotor speed of the generator, the stability performance of the system is improved. Non-linear simulation results are presented under a wide range of operating conditions, with disturbances at different locations as well as various fault clearing sequences, to show the effectiveness and robustness of the proposed controller and its ability to provide efficient damping of low frequency oscillations.
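A minimal sketch of a real-coded GA of the kind named here, with arithmetic (blend) crossover and Gaussian mutation over real-valued controller gains; the fitness reuses a toy damped-oscillator objective like the PSO sketch earlier in this listing, and all ranges are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)

def fitness(gains):
    """Integral of squared speed deviation for a toy swing-equation model (lower is better)."""
    k1, k2 = gains
    dt, x, v, cost = 0.01, 0.1, 0.0, 0.0
    for _ in range(1000):
        a = -(1.0 + k2) * x - (0.2 + k1) * v
        v += a * dt
        x += v * dt
        cost += x * x * dt
    return cost

pop_size, dim, lo, hi = 30, 2, 0.0, 5.0
pop = rng.uniform(lo, hi, (pop_size, dim))

for gen in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[scores.argsort()[: pop_size // 2]]       # truncation selection
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        w = rng.random(dim)
        child = w * a + (1 - w) * b                         # real-coded blend crossover
        child += rng.normal(scale=0.1, size=dim)            # Gaussian mutation
        children.append(np.clip(child, lo, hi))
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("Best PSS gains:", best.round(3))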

Keywords: Particle swarm optimization, power system stabilizer, low frequency oscillations, power system stability.

310 Investigation of Wave Atom Sub-Bands via Breast Cancer Classification

Authors: Nebi Gedik, Ayten Atasoy

Abstract:

This paper investigates successful sub-bands of wave atom transform via classification of mammograms, when the coefficients of sub-bands are used as features. A computer-aided diagnosis system is constructed by using wave atom transform, support vector machine and k-nearest neighbor classifiers. Two-class classification is studied in detail using two data sets, separately. The successful sub-bands are determined according to the accuracy rates, coefficient numbers, and sensitivity rates.

Keywords: Breast cancer, wave atom transform, SVM, k-NN.

309 Comparison of Multivariate Adaptive Regression Splines and Random Forest Regression in Predicting Forced Expiratory Volume in One Second

Authors: P. V. Pramila, V. Mahesh

Abstract:

Pulmonary function tests are important non-invasive diagnostic tests to assess respiratory impairment and provide quantifiable measures of lung function. Spirometry is the most frequently used measure of lung function and plays an essential role in the diagnosis and management of pulmonary diseases. However, the test requires considerable patient effort and cooperation, which is markedly related to the age of the patient and results in incomplete data sets. This paper presents nonlinear models built using multivariate adaptive regression splines (MARS) and random forest regression to predict missing spirometric features. Random forest based feature selection is used to enhance both the generalization capability and the interpretability of the model. In the present study, flow-volume data were recorded for N = 198 subjects. The ranked feature importance calculated by the random forest model shows that the spirometric features FVC, FEF25, PEF, FEF25-75 and FEF50 and the demographic parameter height are the important descriptors. A comparison of the performance of both models shows that the prediction ability of MARS with the top two ranked features, namely FVC and FEF25, is higher, yielding a model fit of R² = 0.96 and R² = 0.99 for normal and abnormal subjects, respectively. The root mean square error analysis of the RF and MARS models also shows that the latter is capable of predicting the missing values of FEV1 with a notably lower error of 0.0191 (normal subjects) and 0.0106 (abnormal subjects) using the aforementioned input features. It is concluded that combining feature selection with a prediction model provides a minimal subset of predominant features to train the model as well as better prediction performance. This analysis can assist clinicians with an intelligent support system for medical diagnosis and the improvement of clinical care.
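A minimal sketch of the random-forest feature-ranking step described above, using scikit-learn on synthetic spirometry-like columns (FVC, FEF25, PEF, height); MARS itself is omitted since it is not part of scikit-learn, and the data are illustrative assumptions, not the N = 198 study subjects.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
n = 198
# Synthetic predictors loosely named after the spirometric features in the abstract.
X = np.column_stack([
    rng.normal(3.5, 0.8, n),    # FVC
    rng.normal(6.0, 1.5, n),    # FEF25
    rng.normal(7.5, 1.8, n),    # PEF
    rng.normal(165, 10, n),     # height
])
fev1 = 0.8 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(scale=0.1, size=n)   # target FEV1

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, fev1)
for name, imp in sorted(zip(["FVC", "FEF25", "PEF", "height"], rf.feature_importances_),
                        key=lambda kv: -kv[1]):
    print(f"{name:6s} importance = {imp:.3f}")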

Keywords: FEV1, Multivariate Adaptive Regression Splines, Pulmonary Function Test, Random Forest.

308 Wind Farm Modeling for Steady State and Dynamic Analysis

Authors: G. Kabashi, K. Kadriu, A. Gashi, S. Kabashi, G. Pula, V. Komoni

Abstract:

This paper focuses on the PSS/E modeling of wind farms of the doubly-fed induction generator (DFIG) type and their impact on power system operation. Since wind turbine generators (WTG) do not have the same characteristics as synchronous generators, appropriate modeling of wind farms is essential for transmission system operators to analyze the best options for transmission grid reinforcement as well as to evaluate the impact of wind power on the reliability and security of supply. With the high expected penetration of wind power into the power system, a simultaneous loss of wind farm generation would put power system security and reliability at risk. Therefore, the main wind grid code requirements concern the fault ride-through capability and the frequency operating range of wind turbines. In the case of grid faults, wind turbines have to supply a defined reactive power depending on the instantaneous voltage and to return quickly to normal operation.

Keywords: Power system transients, PSS/E dynamic simulation, Doubly-fed Induction Generator.

307 A Study on Holosen-Pleistosen Sedimentology of Morphotectonic Structure and Seismicity of Gökova Bay

Authors: Ebru Aktepe Erkoç, Atilla Uluğ

Abstract:

This research examines the relationship between Gökova Bay's morphotectonic structure and its seismicity; it is clear that there are many active faults in the region. The existence of a thick sedimentary accumulation since Late Quaternary times is evident from previous geophysical work in the region and from the interpretation of the seismic data planned to be acquired from the Bay. In the areas interpreted as tectonically active from these data, the occurrence of successive earthquakes in the last few years is remarkable. By analyzing large earthquakes affecting the areas lying within the sediments of the West Anatolian Collapse System, this paper aims to reveal the fault systems generating these earthquakes, to determine the seismicity of the residential areas located right next to them, to anticipate the measures to be taken against possible earthquake hazards, to identify the areas posing a risk for residential and urban planning, and to determine, at least partly, the characteristics of the basin.

Keywords: Gökova Bay, seismic, sedimentation, West Anatolian Region.

306 Atrial Fibrillation Analysis Based on Blind Source Separation in 12-lead ECG

Authors: Pei-Chann Chang, Jui-Chien Hsieh, Jyun-Jie Lin, Feng-Ming Yeh

Abstract:

Atrial fibrillation is the most common sustained arrhythmia encountered by clinicians. Because the atrial activity waveform during atrial fibrillation is not directly visible to humans, it is necessary to develop an automatic diagnosis system. The 12-lead ECG is now available in hospitals and is appropriate for using Independent Component Analysis (ICA) to estimate the atrial activity (AA) period. In this research, we also adopt a second-order blind identification approach to transform the sources extracted by ICA into more precise signals, and we then use a frequency-domain algorithm to perform the classification. In experiments, we obtain significant results on clinical data.
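A minimal sketch of the signal-processing chain outlined in the abstract: FastICA separates a synthetic mixture into source estimates, and the dominant frequency of each source is read off an FFT, as a frequency-domain classifier might. The synthetic "atrial" and "ventricular" components, mixing matrix and sampling rate are illustrative assumptions, not the clinical 12-lead data.

import numpy as np
from sklearn.decomposition import FastICA

fs = 250.0
t = np.arange(0, 10, 1 / fs)                          # 10 s at 250 Hz
atrial = np.sin(2 * np.pi * 6.0 * t)                  # ~6 Hz fibrillatory-like wave
ventricular = np.sign(np.sin(2 * np.pi * 1.2 * t))    # coarse stand-in for QRS activity
S = np.c_[atrial, ventricular]
X = S @ np.array([[0.7, 0.3], [0.4, 0.6]]).T          # two synthetic "leads"

sources = FastICA(n_components=2, random_state=0).fit_transform(X)

for i in range(sources.shape[1]):
    spec = np.abs(np.fft.rfft(sources[:, i]))
    freqs = np.fft.rfftfreq(len(t), d=1 / fs)
    print(f"source {i}: dominant frequency = {freqs[spec.argmax()]:.1f} Hz")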

Keywords: 12-Lead ECG, Atrial Fibrillation, Blind Source Separation, Kurtosis.

305 A Delay-Tolerant Distributed Query Processing Architecture for Mobile Environment

Authors: T.P. Andamuthu, Dr. P. Balasubramanie

Abstract:

Intermittent connectivity invalidates the "always on" network assumption made by most distributed query processing systems. In modern-day systems, the absence of network connectivity is treated as a fault. It might not be feasible to transmit right away, over the available connection, all the data accumulated since the last upload, and vital information may be delayed excessively when less important information takes its place. Owing to the restricted and uneven bandwidth, it is vital that mobile nodes make the most advantageous use of connectivity when it arrives. Hence, in order to select the data that needs to be transmitted first, some form of data prioritization is essential. A continuous query processing system for intermittently connected mobile networks, comprising a delay-tolerant continuous query processor distributed across the mobile hosts, is proposed in this paper. In addition, a mechanism for prioritizing query results has been designed that guarantees improved accuracy and reduced delay. Extensive simulation results illustrate that our architecture reduces client power consumption and increases query efficiency.

Keywords: Broadcast, Location, Mobile host, Mobility, Query.

304 Robust Power System Stabilizer Design Using Particle Swarm Optimization Technique

Authors: Sidhartha Panda, N. P. Padhy

Abstract:

Power system stabilizers (PSS) are now routinely used in industry to damp out power system oscillations. In this paper, the particle swarm optimization (PSO) technique is applied to design a robust power system stabilizer. The design problem of the proposed controller is formulated as an optimization problem, and PSO is employed to search for the optimal controller parameters. By minimizing a time-domain objective function based on the deviation in the oscillatory rotor speed of the generator, the stability performance of the system is improved. Non-linear simulation results are presented under a wide range of operating conditions, with disturbances at different locations as well as various fault clearing sequences, to show the effectiveness and robustness of the proposed controller and its ability to provide efficient damping of low frequency oscillations. Further, all simulation results are compared with a conventionally designed power system stabilizer to show the superiority of the proposed design approach.

Keywords: Particle swarm optimization, power system stabilizer, low frequency oscillations, power system stability.

303 Discussing Embedded versus Central Machine Learning in Wireless Sensor Networks

Authors: Anne-Lena Kampen, Øivind Kure

Abstract:

Machine learning (ML) can be implemented in Wireless Sensor Networks (WSNs) either as a central solution or as a distributed solution where the ML is embedded in the nodes. Embedding improves privacy and may reduce prediction delay; in addition, the number of transmissions is reduced. However, quality factors such as prediction accuracy, fault detection efficiency and coordinated control of the overall system suffer. Here, we discuss and highlight the trade-offs that should be considered when choosing between embedded and centralized ML, especially for multihop networks. In addition, we present estimations that illustrate the energy trade-offs between embedded and centralized ML. Although the total network energy consumption is lower with central prediction, central prediction makes the network more prone to partitioning due to the high forwarding load on the one-hop nodes. Moreover, the continuous improvement in the number of operations per joule for embedded devices will shift the energy balance toward embedded prediction.
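A minimal sketch of the kind of energy estimation the abstract refers to: total radio energy when every node forwards raw samples over several hops to a central sink, versus local compute energy plus occasional event reports when prediction is embedded. The node count, hop count and per-bit/per-operation energy figures are illustrative assumptions, not the paper's numbers.

# Illustrative parameters (assumptions, not measurements from the paper).
N_NODES = 100
SAMPLES_PER_HOUR = 60
AVG_HOPS = 3.0                 # multihop forwarding toward the sink
E_TX_PER_BIT = 2e-7            # J per bit transmitted (incl. relaying)
BITS_PER_SAMPLE = 256
E_PER_OP = 1e-9                # J per arithmetic operation on the node MCU
OPS_PER_INFERENCE = 200_000
EVENT_RATE = 0.05              # fraction of samples that trigger a report when embedded

def central_energy_per_hour() -> float:
    """All raw samples are forwarded over AVG_HOPS hops to the sink."""
    return N_NODES * SAMPLES_PER_HOUR * BITS_PER_SAMPLE * E_TX_PER_BIT * AVG_HOPS

def embedded_energy_per_hour() -> float:
    """Each node runs inference locally and only transmits detected events."""
    compute = N_NODES * SAMPLES_PER_HOUR * OPS_PER_INFERENCE * E_PER_OP
    radio = N_NODES * SAMPLES_PER_HOUR * EVENT_RATE * BITS_PER_SAMPLE * E_TX_PER_BIT * AVG_HOPS
    return compute + radio

print(f"central : {central_energy_per_hour():.3f} J/h")
print(f"embedded: {embedded_energy_per_hour():.3f} J/h")
# With these figures central prediction uses less total energy, but all of it flows through
# the one-hop nodes; improving ops-per-joule shifts the balance toward embedded prediction.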

Keywords: Central ML, embedded machine learning, energy consumption, local ML, Wireless Sensor Networks, WSN.

302 Neural Network Based Icing Identification and Fault Tolerant Control of an A340 Aircraft

Authors: F. Caliskan

Abstract:

This paper presents a neural network (NN) identification of icing parameters in an A340 aircraft and a reconfiguration technique to keep the aircraft performance close to the performance prior to icing. Five aircraft parameters are assumed to be considerably affected by icing. The off-line training for identifying the clean and iced dynamics is based on the Levenberg-Marquardt backpropagation algorithm. The icing parameters are located in the system matrix, and the physical locations of the icing are assumed to be at the right and left wings. The reconfiguration is based on the technique known as the control mixer approach or pseudo-inverse technique. This technique generates a new control input vector such that the aircraft dynamics are not much affected by icing. In the simulations, the longitudinal and lateral dynamics of an Airbus A340 aircraft model are considered, and the stability derivatives affected by icing are identified. The simulation results show successful NN identification of the icing parameters and reconfigured flight dynamics with performance similar to that before icing; in other words, the destabilizing effect of icing is compensated.
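A minimal sketch of the pseudo-inverse (control mixer) idea named in the abstract: given a nominal input matrix B and a degraded matrix B_iced, the remapped input u_new = pinv(B_iced) @ B @ u makes the degraded aircraft reproduce the nominal control effect. The 2x2 matrices are illustrative assumptions, not the identified A340 model.

import numpy as np

B_nominal = np.array([[1.0, 0.2],
                      [0.1, 0.8]])           # nominal control effectiveness (illustrative)
B_iced = np.array([[0.7, 0.15],
                   [0.05, 0.5]])             # degraded effectiveness after icing (illustrative)

def control_mixer(u_cmd: np.ndarray) -> np.ndarray:
    """Remap the commanded input so the iced aircraft produces the nominal control effect."""
    return np.linalg.pinv(B_iced) @ B_nominal @ u_cmd

u_cmd = np.array([0.5, -0.3])                # original controller output
u_new = control_mixer(u_cmd)
print("nominal effect :", B_nominal @ u_cmd)
print("achieved effect:", B_iced @ u_new)    # matches the nominal effect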

Keywords: Aircraft Icing, Stability Derivatives, Neural Network Identification, Reconfiguration.

301 Performance Analysis of Cluster Based Dual Tiered Network Model with INTK Security Scheme in a Wireless Sensor Network

Authors: D. Satish Kumar, S. Karthik

Abstract:

A dual tiered network model is designed to overcome the problems of energy alert and fault tolerance. The model minimizes delay time and overcomes link failures. Performance analysis of the dual tiered network model is studied in this paper, where the CA and LS schemes are compared with the DEO optimal scheme. We then evaluate the Integrated Network Topological Control and Key Management (INTK) scheme, which was proposed to add security features to wireless sensor networks. Clustering efficiency, level of protection and time complexity are some of the parameters of the INTK scheme that were analyzed. We then evaluate the Cluster based Energy Competent n-coverage scheme (CEC n-coverage scheme) to ensure area coverage for wireless sensor networks.

Keywords: CEC n-coverage scheme, Clustering efficiency, Dual tiered network, Wireless sensor networks.

300 On Mobile Checkpointing using Index and Time Together

Authors: Awadhesh Kumar Singh

Abstract:

Checkpointing is one of the commonly used techniques to provide fault tolerance in distributed systems so that the system can continue to operate even if one or more components have failed. However, mobile computing systems are constrained by low bandwidth, mobility, lack of stable storage, frequent disconnections and limited battery life. Hence, checkpointing protocols with fewer synchronization messages and fewer checkpoints are preferred in mobile environments. There are two different, though not orthogonal, approaches to checkpointing mobile computing systems: time-based and index-based. Our protocol is a fusion of these two approaches, though not the first of its kind. In the present exposition, an index-based checkpointing protocol has been developed which uses time to indirectly coordinate the creation of consistent global checkpoints for mobile computing systems. The proposed algorithm is non-blocking, adaptive, and does not use any control messages. Compared to other contemporary checkpointing algorithms, it is computationally more efficient because it takes fewer checkpoints and does not need to compute dependency relationships. A brief account of important and relevant works in both fields, time-based and index-based, has also been included in the presentation.

Keywords: Checkpointing, forced checkpoint, mobile computing, recovery, time-coordinated.

299 A Test to Express Diagnostic Cohesion of Football Team

Authors: Alexandra O. Savinkina

Abstract:

We propose to assess the cohesion of a football team by its subject-goal and subject-value unity according to the theory of A. V. Petrovsky. Goal unity was measured by the degree of agreement on priority targets among the players in the team. Value unity was estimated by the coincidence of their ideas about a perfect football player. On the basis of a provisional diagnosis of six teams, we compiled the lists of goals and values. The tests were piloted on 35 football teams. The results allowed us not only to compare the cohesion of the different teams quantitatively, but also to identify subgroups within a team.

Keywords: Cohesion, football, psychodiagnostic, soccer, sports team, value-orientation unity.

298 Reliability Modeling and Data Analysis of Vacuum Circuit Breaker Subject to Random Shocks

Authors: Rafik Medjoudj, Rabah Medjoudj, D. Aissani

Abstract:

Electrical substation components are often subject to degradation due to over-voltage or over-current caused by a short circuit or lightning. Particular interest is given to the circuit breaker, owing to the importance of its function and the danger of its failure. This component degrades gradually with use, and it is also subject to the shock process resulting from the stress of isolating the fault when a short circuit occurs in the system. In this paper, based on developments in failure mechanisms, the wear-out of the circuit breaker contacts is modeled. The aim of this work is to evaluate the breaker's reliability and consequently its residual lifetime. The shock process is based on two random variables: the arrival of shocks and their magnitudes. The arrival of shocks was modeled using a homogeneous Poisson process (HPP). By simulation, the dates of short-circuit arrivals were generated together with their magnitudes. The same simulation principle is applied to the cumulative amount of contact wear. The objective reached is to find a formulation of the wear function depending on the number of solicitations of the circuit breaker.
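A minimal sketch of the simulation described above: shock arrival dates are drawn from a homogeneous Poisson process (exponential inter-arrival times), each shock carries a random magnitude, and wear accumulates until a threshold is crossed. The rate, magnitude distribution and wear limit are illustrative assumptions, not the paper's fitted values.

import numpy as np

rng = np.random.default_rng(5)

def simulate_wear(rate_per_year=2.0, horizon_years=40.0, wear_limit=10.0):
    """Return (failure_time, arrival_times) for an HPP shock/wear model."""
    t, wear, arrivals = 0.0, 0.0, []
    while t < horizon_years:
        t += rng.exponential(1.0 / rate_per_year)   # exponential inter-arrival -> HPP
        if t >= horizon_years:
            break
        arrivals.append(t)
        wear += rng.lognormal(mean=-1.0, sigma=0.5) # random wear magnitude per short circuit
        if wear >= wear_limit:
            return t, arrivals                      # contacts worn out
    return None, arrivals                           # survived the horizon

lifetimes = [simulate_wear()[0] for _ in range(5000)]
observed = [lt for lt in lifetimes if lt is not None]
print(f"Estimated P(failure within 40 y) = {len(observed) / 5000:.3f}")
if observed:
    print(f"Mean lifetime of failed units = {np.mean(observed):.1f} years")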

Keywords: reliability, short-circuit, models of shocks.

297 Layout Based Spam Filtering

Authors: Claudiu N. Musat

Abstract:

Due to the constant increase in the volume of information available to applications in fields varying from medical diagnosis to web search engines, accurate support for similarity becomes an important task. This is also the case for spam filtering techniques, where the similarities between known and incoming messages are the foundation of the spam/not-spam decision. We present a novel approach to filtering based solely on layout, whose goal is not only to correctly identify spam but also to warn about major emerging threats. We propose a mathematical formulation of the email message layout and, based on it, we elaborate an algorithm to separate different types of emails and find the new, numerically relevant spam types.
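A minimal sketch of layout-based clustering in the spirit of the abstract (its keywords name k-means): each message is reduced to a layout feature vector and k-means groups similar layouts, so a large, tightly packed new cluster can be flagged as an emerging campaign. The feature choice (line count, link count, image count, text-to-HTML ratio) and the data are illustrative assumptions, not the paper's formulation.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# Illustrative layout features per message: [n_lines, n_links, n_images, text_to_html_ratio]
legit = rng.normal([40, 2, 0, 0.9], [10, 1, 0.5, 0.05], size=(80, 4))
campaign = rng.normal([8, 15, 4, 0.2], [2, 3, 1, 0.05], size=(40, 4))   # burst of near-identical spam layouts
X = StandardScaler().fit_transform(np.vstack([legit, campaign]))

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for c in range(3):
    members = X[km.labels_ == c]
    spread = members.std(axis=0).mean()
    print(f"cluster {c}: size={len(members):3d}  mean spread={spread:.2f}")
# A sizeable cluster with very small spread suggests a new, numerically relevant spam layout.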

Keywords: Clustering, layout, k-means, spam.

296 Monte Carlo and Biophysics Analysis in a Criminal Trial

Authors: Luca Indovina, Carmela Coppola, Carlo Altucci, Riccardo Barberi, Rocco Romano

Abstract:

In this paper, a real court case held in Italy at the Court of Nola is considered, in which a correct physical description, conducted with both Monte Carlo and biophysical analyses, would have been sufficient to arrive at conclusions confirmed by documentary evidence. This is an example of how forensic physics can be useful in confirming documentary evidence in order to reach hardly questionable conclusions. This was a libel trial in which the defendant, Mr. DS (Defendant for Slander), had falsely accused one of his neighbors, Mr. OP (Offended Person), of having caused him some damage. The damage would have been caused by a piece of external plaster that would have detached from the neighbor's property and hit Mr. DS while he was in his garden, more than a meter away from the facade of the building from which the plaster piece would have detached. In the trial, Mr. DS claimed to have suffered a scratch on his forehead, but he never showed the plaster that had hit him, nor was he able to tell where the plaster would have come from. Furthermore, Mr. DS presented a medical certificate with a diagnosis of contusion of the cerebral cortex. On the contrary, the images from Mr. OP's security cameras do not show any movement in the garden of Mr. DS in a long interval of time (about 2 hours) around the time of the alleged accident, nor do they show anyone entering or leaving the house of Mr. DS in the same interval. The biophysical analysis shows that both the diagnosis in the medical certificate and the wound declared by the defendant, already in conflict with each other, are incompatible with the fall of external plaster pieces too small to be found. The wind was at level 1 of the Beaufort scale, that is, unable even to raise dust (level 4 of the Beaufort scale). Therefore, the motion of the plaster pieces can be described as projectile motion, whereas collisions with the building cornice can be treated using Newton's law of restitution. Numerous numerical Monte Carlo simulations show that the pieces of plaster could not even have reached the garden of Mr. DS, let alone a distance of over 1.30 meters. The results agree with the documentary evidence (the images from Mr. OP's security cameras) that Mr. DS could not have been hit by plaster pieces coming from Mr. OP's property.
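A minimal sketch of the kind of Monte Carlo described: plaster fragments detach with small randomized initial velocities, fall under gravity as projectiles, may bounce once on a cornice ledge with a coefficient of restitution (Newton's law), and the simulation records how far from the facade they land. Drop height, cornice geometry, velocity spread and restitution values are illustrative assumptions, not the values used in the expert report.

import numpy as np

rng = np.random.default_rng(7)
G = 9.81                                       # m/s^2

def landing_distance(h0=6.0, cornice_y=2.5, cornice_depth=0.3):
    """Horizontal distance from the facade at which one fragment lands (illustrative model)."""
    vx = abs(rng.normal(0.0, 0.2))             # small outward velocity at detachment (m/s)
    vy, x, y, dt = 0.0, 0.0, h0, 1e-3
    bounced = False
    while y > 0.0:
        vy -= G * dt
        x += vx * dt
        y += vy * dt
        # Single bounce on the cornice ledge (Newton's law of restitution) if the fragment hits it.
        if not bounced and y <= cornice_y and x <= cornice_depth and vy < 0:
            e = rng.uniform(0.3, 0.6)          # coefficient of restitution
            vy, bounced = -e * vy, True
    return x

distances = np.array([landing_distance() for _ in range(10_000)])
print(f"max landing distance = {distances.max():.2f} m")
print(f"P(distance > 1.30 m) = {(distances > 1.30).mean():.4f}")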

Keywords: Biophysical analysis, Monte Carlo simulations, Newton’s law of restitution, projectile motion.

295 Investigations into Effect of Neural Network Predictive Control of UPFC for Improving Transient Stability Performance of Multimachine Power System

Authors: Sheela Tiwari, R. Naresh, R. Jha

Abstract:

The paper presents an investigation into the effect of neural network predictive control of a UPFC on the transient stability performance of a multimachine power system. The proposed controller consists of a neural network model of the test system, which is used to predict the future control inputs using the damped Gauss-Newton method with 'backtracking' as the line search method for step selection. The benchmark two-area, four-machine system that mimics the behavior of large power systems is taken as the test system and is subjected to three-phase short-circuit faults at different locations over a wide range of operating conditions. The simulation results clearly establish the robustness of the proposed controller to the fault location, an increase in the critical clearing time for the circuit breakers, and improved damping of the power oscillations as compared to a conventional PI controller.
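A minimal sketch of the damped Gauss-Newton step with backtracking line search named above, applied to a small nonlinear least-squares problem; the residual function is a toy stand-in, not the neural-network prediction model of the test system.

import numpy as np

def residuals(p):
    """Toy nonlinear residuals r(p); Gauss-Newton minimizes 0.5 * ||r(p)||^2."""
    x = np.linspace(0.0, 1.0, 20)
    target = 2.0 * np.exp(-1.5 * x)
    return p[0] * np.exp(-p[1] * x) - target

def jacobian(p, eps=1e-6):
    r0 = residuals(p)
    J = np.zeros((r0.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p); dp[j] = eps
        J[:, j] = (residuals(p + dp) - r0) / eps
    return J

p = np.array([1.0, 0.1])
for _ in range(20):
    r, J = residuals(p), jacobian(p)
    step = np.linalg.lstsq(J, -r, rcond=None)[0]    # Gauss-Newton direction
    alpha, cost = 1.0, 0.5 * r @ r
    # Backtracking line search: halve the step until the cost actually decreases (damping).
    while 0.5 * residuals(p + alpha * step) @ residuals(p + alpha * step) > cost and alpha > 1e-8:
        alpha *= 0.5
    p = p + alpha * step

print("fitted parameters:", p.round(3))             # should approach [2.0, 1.5]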

Keywords: Identification, Neural networks, Predictive control, Transient stability, UPFC.

294 A Support System to Consult Remote another Doctor on Assessment and/or Medical Treatment Plan when a Doctor has a Patient not in His/Her Major

Authors: T. Gotoh, T. Takayama, M. Ishiki, T. Ikeda

Abstract:

Recently, doctors' specialties have become divided into a very large number of detailed areas. However, it is actually not rare for a doctor to have a patient whose condition lies outside his/her specialty, for whom he/she must nevertheless make an assessment and a medical treatment plan. According to our investigation, conventional approaches such as image diagnosis cooperation are insufficient. This paper proposes an 'Assessment / Medical Treatment Plan Consulting System' that allows a doctor to consult a remote colleague. We have implemented a pilot system based on our proposal, and its effectiveness is confirmed by an evaluation.

Keywords: Application, computational intelligence and telecommunications, medicine, intelligent systems.

293 Memory Leak Detection in Distributed System

Authors: Roohi Shabrin S., Devi Prasad B., Prabu D., Pallavi R. S., Revathi P.

Abstract:

Due to memory leaks, valuable system memory often gets wasted and is denied to other processes, thereby affecting computational performance. If an application's memory usage exceeds the virtual memory size, it can lead to a system crash. Current memory leak detection techniques for clusters are reactive and display the memory leak information only after the execution of the process (they detect a memory leak only after it occurs). This paper presents a Dynamic Memory Monitoring Agent (DMMA) technique. The DMMA framework performs dynamic memory leak detection, detecting leaks while the application is in its execution phase. When a memory leak in any process in the cluster is identified, DMMA informs the end users so that they can take corrective actions, and it also submits the affected process to a healthy node in the system, thus providing a reliable service to the user. DMMA maintains information about the memory consumption of executing processes, and based on this information and critical states, it can improve the reliability and efficacy of cluster computing.
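A minimal sketch of the monitoring idea: sample a process's resident memory periodically and flag a suspected leak when usage grows monotonically over a window. It uses the third-party psutil package and a simple growth heuristic; the sampling interval, window and threshold are illustrative assumptions, not the DMMA design.

import time
import psutil   # third-party: pip install psutil

def suspected_leak(pid: int, samples: int = 6, interval_s: float = 5.0,
                   min_growth_bytes: int = 10 * 1024 * 1024) -> bool:
    """Return True if the process's RSS grows at every sample and by a meaningful total."""
    proc = psutil.Process(pid)
    rss = []
    for _ in range(samples):
        rss.append(proc.memory_info().rss)
        time.sleep(interval_s)
    monotonic = all(b > a for a, b in zip(rss, rss[1:]))
    return monotonic and (rss[-1] - rss[0]) >= min_growth_bytes

if __name__ == "__main__":
    import os
    print("leak suspected for this process:", suspected_leak(os.getpid(), samples=3, interval_s=0.5))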

Keywords: Dynamic Memory Monitoring Agent (DMMA), Cluster Computing, Memory Leak, Fault Tolerant Framework, Dynamic Memory Leak Detection (DMLD).

292 Technique for Grounding System Design in Distribution Substation

Authors: N. Rugthaicharoencheep, A. Charlangsut, B. Ainsuk, A. Phayomhom

Abstract:

This paper presents the significant factors and gives some suggestions that should be known before designing a grounding system. The main objective of this paper is to guide the first step for someone who intends to design a grounding system, before studying the details later. A well-designed grounding system protects against damage from faults and can thus save human lives and power system equipment. Unsafe conditions fall into three cases. Case 1) The maximum touch voltage exceeds the safety criteria. In this case, the conductor compression ratio of the ground grid should first be adjusted to obtain an optimal spacing of the ground grid conductors; if it is still over the limit, the earth resistivity should be considered afterward. Case 2) The maximum step voltage exceeds the safety criteria. In this case, increasing the number of ground grid conductors around the boundary can solve the problem. Case 3) Both the maximum touch and step voltages exceed the safety criteria. In this case, follow the solutions explained in Case 1 and Case 2. As another suggestion, vary the depth of the ground grid until the maximum step and touch voltages no longer exceed the safety criteria.
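A minimal sketch of the decision logic laid out in the three cases above: computed touch and step voltages are compared against safety limits and the corresponding remedial step is suggested. The voltage values and limits are illustrative placeholders; in practice they would come from an IEEE Std 80-style grid calculation.

def grounding_advice(touch_v: float, step_v: float,
                     touch_limit: float, step_limit: float) -> str:
    """Map the touch/step voltage check onto the three cases described in the abstract."""
    touch_bad, step_bad = touch_v > touch_limit, step_v > step_limit
    if touch_bad and step_bad:
        return ("Case 3: apply both remedies - adjust conductor spacing / earth resistivity "
                "and add boundary conductors; also consider varying the grid depth.")
    if touch_bad:
        return ("Case 1: adjust the conductor compression ratio (grid spacing); "
                "if still over the limit, address the earth resistivity.")
    if step_bad:
        return "Case 2: increase the number of ground grid conductors around the boundary."
    return "Design acceptable: touch and step voltages are within the safety criteria."

# Illustrative values only (volts); real limits depend on body weight, fault duration, soil, etc.
print(grounding_advice(touch_v=820.0, step_v=1450.0, touch_limit=700.0, step_limit=2200.0))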

Keywords: Grounding System, Touch Voltage, Step Voltage, Safety Criteria.
