Search results for: Rotor dynamic analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10041

7281 Analysis of Different Designed Landing Gears for a Light Aircraft

Authors: Essam A. Al-Bahkali

Abstract:

The design of a landing gear is one of the fundamental aspects of aircraft design. The need for light weight, high strength and stiffness, coupled with techno-economic feasibility, is key to the acceptability of any landing gear construction. In this paper, an approach for analyzing two differently designed landing gears for an unmanned aerial vehicle (UAV) using advanced CAE techniques is applied. Different landing conditions have been considered for both models. The maximum principal stresses for each model, along with the factor of safety, are calculated for every loading condition. A conclusion is drawn about which geometry is better.

Keywords: Landing Gear, Model, Finite Element Analysis, Aircraft.

PDF Downloads: 5359
7280 Molecular Evolutionary Analysis of Yeast Protein Interaction Network

Authors: Soichi Ogishima, Takeshi Hase, So Nakagawa, Yasuhiro Suzuki, Hiroshi Tanaka

Abstract:

To understand life as a biological system, an evolutionary understanding is indispensable. Protein interaction data are rapidly accumulating and are suitable for system-level evolutionary analysis. We have analyzed the yeast protein interaction network by both mathematical and biological approaches. In this poster presentation, we inferred the evolutionary birth periods of yeast proteins by reconstructing phylogenetic profiles. Hub proteins, which have high connection degrees, have been thought to be evolutionarily old, but our analysis showed that, on the contrary, hub proteins are evolutionarily new. We also examined the evolutionary processes of protein complexes and found that member proteins of a complex tend to have appeared in the same evolutionary period. Our results suggest that the protein interaction network evolved by modules that form functional units. We also reconstructed standardized phylogenetic trees and calculated the evolutionary rates of yeast proteins, finding no obvious correlation between the evolutionary rates and the connection degrees of yeast proteins.

Keywords: Protein interaction network, evolution, modularity, evolutionary rate, connection degrees.

PDF Downloads: 1343
7279 A Parametric Study: Frame Analysis Method for Masonry Arch Bridges

Authors: M. E. Rahman, D. Sujan, V. Pakrashi, P. Fanning

Abstract:

The predictability of masonry arch bridges and their behaviour is widely considered doubtful due to the lack of knowledge about the condition of a given masonry arch bridge. The assessment methods for masonry arch bridges include MEXE, ARCHIE, RING and the Frame Analysis Method. The material properties of the masonry and fill material are extremely difficult to determine accurately. Consequently, it is necessary to examine the effect of the load dispersal angle through the fill material, the effect of variations in the stiffness of the masonry, and the tensile and compressive strengths of the masonry-mortar continuum. It is also important to understand the effect of the fill material on the load dispersal angle in order to determine their influence on ratings. In this paper, a series of parametric studies is carried out to examine the sensitivity of assessment ratings to the various sets of input data required by the frame analysis method.

Keywords: Arch Bridge, Frame Analysis Method, Masonry

PDF Downloads: 2169
7278 Functional Near Infrared Spectroscope for Cognition Brain Tasks by Wavelets Analysis and Neural Networks

Authors: Truong Quang Dang Khoa, Masahiro Nakagawa

Abstract:

Research on brain-computer interfaces (BCI) has increased recently. The Functional Near Infrared Spectroscope (fNIRs) is one of the latest technologies, which utilizes light in the near-infrared range to determine brain activity. Because near-infrared technology allows the design of safe, portable, wearable, non-invasive and wireless monitoring systems, fNIRs monitoring of brain hemodynamics can be of value in helping to understand brain tasks. In this paper, we present results of fNIRs signal analysis indicating that there exist distinct patterns of hemodynamic responses from which brain tasks can be recognized, a step toward developing a BCI. We applied two different mathematical tools separately: wavelet analysis for preprocessing, serving as signal filtering and feature extraction, and neural networks as a classification module for cognitive brain tasks. We also discuss and compare our approach with other methods; it performs better, with an average classification accuracy of 99.9%.
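
As a rough illustration of this pipeline (and not the authors' implementation), the sketch below extracts wavelet sub-band energies from synthetic epochs with PyWavelets and feeds them to a small neural-network classifier; the wavelet, epoch length and network size are arbitrary assumptions.

```python
# Minimal sketch of a wavelet-feature + neural-network pipeline for fNIRs epochs
# (synthetic data; wavelet choice, epoch length and network size are assumptions).
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def wavelet_features(epoch, wavelet="db4", level=4):
    """Decompose one epoch and keep the energy of each wavelet sub-band."""
    coeffs = pywt.wavedec(epoch, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 512))   # 200 epochs, 512 samples each (placeholder)
labels = rng.integers(0, 2, size=200)      # two cognitive tasks (placeholder)

X = np.vstack([wavelet_features(e) for e in epochs])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```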

Keywords: functional near infrared spectroscope (fNIRs), brain-computer interface (BCI), wavelets, neural networks, brain activity, neuroimaging.

PDF Downloads: 2010
7277 Matrix Converter Fed Brushless DC Motor Using Field Programmable Gate Array

Authors: P. Subha Karuvelam, M. Rajaram

Abstract:

Brushless DC (BLDC) motors are widely used in industry. They are driven either by indirect AC-AC converters or by direct AC-AC converters. In this paper, a direct AC-AC converter, i.e. a matrix converter, is used to drive a three-phase BLDC motor, which eliminates the bulky DC-link energy storage element. A matrix converter converts the AC power supply to an AC voltage of variable amplitude and variable frequency. A control technique is designed to generate the switching pulses for the three-phase matrix converter. For speed control of the BLDC motor, a separate PI controller and a fuzzy logic controller (FLC) are designed, and a hysteresis current controller is also designed for control of the motor torque. The control schemes are designed and tested separately. The simulation results of both schemes are compared and contrasted in this paper. The results show that the fuzzy logic control scheme outperforms the PI control scheme in terms of the dynamic performance of the BLDC motor. Simulation results are validated with experimental results.
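
For readers unfamiliar with the baseline controller, the following is a minimal sketch of a discrete PI speed loop acting on an assumed first-order motor model; the plant parameters and gains are illustrative, not the paper's BLDC drive or tuning.

```python
# Minimal sketch of a discrete PI speed loop; the first-order mechanical model
# and gains are illustrative assumptions, not the paper's plant or tuning.
dt, J, B = 1e-3, 0.01, 0.1            # time step, inertia, viscous friction (assumed)
kp, ki = 0.8, 40.0                    # PI gains (assumed)
w_ref, w, integral = 150.0, 0.0, 0.0  # reference and actual speed, rad/s

for _ in range(2000):                 # 2 s of simulation
    err = w_ref - w
    integral += err * dt
    torque = kp * err + ki * integral # PI control law
    w += dt * (torque - B * w) / J    # simple mechanical dynamics
print(f"speed after 2 s: {w:.1f} rad/s (reference {w_ref} rad/s)")
```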

Keywords: Fuzzy logic controller, matrix converter, permanent magnet brushless DC motor, PI controller.

PDF Downloads: 1770
7276 Revealing Casein Micelle Dispersion under Various Ranges of NaCl: Evolution of Particle Size and Structure

Authors: Raza Hussain, Claire Gaiani, Joël Scher

Abstract:

Dispersions of casein micelles (CM) were studied at a constant protein concentration of 5 wt% in high-NaCl environments ranging from 0% to 12% by dynamic light scattering (DLS) and Fourier transform infrared (FTIR) spectroscopy. The rehydration profiles obtained were interpreted in terms of wetting, swelling and dispersion stages by using a turbidity method. Two behaviours were observed depending on the salt concentration. The first behaviour (low salt concentration) presents a typical rehydration profile, with a significant change between 3% and 6% NaCl indicating quick wetting, swelling and a long dispersion stage. In contrast, the dispersion stage of the second behaviour (high salt concentration) was significantly shortened, indicating a strong modification of the protein backbone. A salt increase results in destabilization of the micelle and the formation of more or less aggregated mini-micelles, with average micelle sizes ranging from 100 to 200 nm. For the first time, the estimates of secondary structural elements (irregular, β-sheet, α-helix and turn) from the Amide III assignments were correlated with results from Amide I.

Keywords: Casein, DLS, FTIR, Ionic environment.

PDF Downloads: 1853
7275 Uncertainty Analysis of a Hardware in Loop Setup for Testing Products Related to Building Technology

Authors: Balasundaram Prasaant, Ploix Stephane, Delinchant Benoit, Muresan Cristian

Abstract:

Hardware in Loop (HIL) testing is used to test and validate products, and in building technology it is particularly important to test products for their efficiency. The test rig in the HIL simulator may contribute some uncertainty to the measured efficiency. These uncertainties include physical uncertainties and scenario-based uncertainties. In this paper, a simple uncertainty analysis framework for an HIL setup is presented, considering only the physical uncertainties. The entire HIL setup is modeled in Dymola. The uncertain sources are chosen based on available knowledge of the components and on expert knowledge. For the propagation of uncertainty, Monte Carlo simulation is used, since it is reliable and easy to use. The article shows how an HIL setup can be modeled and how uncertainty propagation can be performed on it. Such an approach is not common in building energy analysis.
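
A minimal sketch of the Monte Carlo propagation step, assuming a simple algebraic efficiency model and made-up sensor uncertainties in place of the paper's Dymola model:

```python
# Minimal sketch of Monte Carlo propagation of physical uncertainties through a
# toy thermal-efficiency model; distributions and values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 10_000
flow = rng.normal(0.05, 0.002, N)     # kg/s, flow-meter uncertainty (assumed)
dT   = rng.normal(10.0, 0.30, N)      # K, temperature-difference uncertainty (assumed)
P_el = rng.normal(2500.0, 50.0, N)    # W, electrical-power uncertainty (assumed)
cp = 4186.0                           # J/(kg.K), water

efficiency = flow * cp * dT / P_el    # useful thermal output over electrical input
mean, half = efficiency.mean(), 1.96 * efficiency.std(ddof=1)
print(f"efficiency = {mean:.3f} +/- {half:.3f} (approx. 95% interval)")
```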

Keywords: Energy in Buildings, Hardware in Loop, Modelica (Dymola), Monte Carlo Simulation, Uncertainty Propagation.

PDF Downloads: 539
7274 Optimal Combination for Modal Pushover Analysis by Using Genetic Algorithm

Authors: K. Shakeri, M. Mohebbi

Abstract:

In order to include the effects of the higher modes in pushover analysis, several multi-modal pushover procedures have been presented in recent years. In these methods the responses of the considered modes are combined by the square-root-of-sum-of-squares (SRSS) rule, although the application of elastic modal combination rules in the inelastic phase is no longer valid. In this research, the feasibility of defining an efficient alternative combination method is investigated. Two steel moment-frame buildings, denoted SAC-9 and SAC-20, are considered under ten earthquake records. The nonlinear responses of the structures are estimated by the direct algebraic combination of the weighted responses of the separate modes. The weight of each mode is defined so that the resulting combined response has minimum error relative to the nonlinear time-history analysis. A genetic algorithm (GA) is used to minimize the error and optimize the weight factors. The optimal factors obtained for each mode in the different cases are compared in order to find unique appropriate weight factors for each mode across all cases.
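
The idea of replacing SRSS with an error-minimising weighted combination can be sketched as follows; SciPy's differential evolution stands in for the genetic algorithm, and the modal responses and "exact" benchmark are synthetic placeholders rather than the SAC frame results.

```python
# Minimal sketch: combine modal responses with weights chosen to minimise the
# error against a nonlinear time-history benchmark.  Differential evolution is
# used as a stand-in for the GA; all response data below are synthetic.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(2)
modes = rng.normal(size=(3, 20))                            # 3 modal storey-response profiles
target = 1.0 * modes[0] + 0.4 * modes[1] + 0.1 * modes[2]   # "exact" nonlinear response

srss = np.sqrt((modes ** 2).sum(axis=0))                    # classical SRSS combination

def error(w):                                               # weighted algebraic combination
    return np.linalg.norm(w @ modes - target)

res = differential_evolution(error, bounds=[(-2, 2)] * 3, seed=2)
print("SRSS error     :", round(float(np.linalg.norm(srss - target)), 3))
print("weighted error :", round(float(res.fun), 4), "with weights", np.round(res.x, 2))
```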

Keywords: Genetic Algorithm, Modal Pushover, Optimal weight.

PDF Downloads: 1769
7273 Sidelobe Reduction in Cognitive Radio Systems Using Hybrid Technique

Authors: Atif Elahi, Ijaz Mansoor Qureshi, Mehreen Atif, Noor Gul

Abstract:

Orthogonal frequency division multiplexing (OFDM) is one of the best candidates for dynamic spectrum access due to its flexibility in spectrum shaping. However, the high sidelobes of the OFDM signal, which result in high out-of-band radiation, introduce significant interference to users operating in its vicinity. This problem becomes more critical in a cognitive radio (CR) system, which enables secondary users (SUs) to access the spectrum holes not used by the primary users (PUs) at that time. In this paper, we present a generalized OFDM framework capable of describing any sidelobe suppression technique, regardless of whether one or several techniques are used. Based on that framework, we propose the cancellation carrier (CC) technique in conjunction with the generalized sidelobe canceller (GSC) to reduce the out-of-band radiation in the region where the licensed users are operating. Simulation results show that the proposed technique reduces the out-of-band radiation better than the existing techniques found in the literature.

Keywords: Cognitive radio, cancellation carriers, generalized sidelobe canceller, out-of-band radiation, orthogonal frequency division multiplexing.

PDF Downloads: 1176
7272 The Behavior of Dam Foundation Reinforced by Stone Columns: Case Study of Kissir Dam-Jijel

Authors: Toufik Karech, Abderahmen Benseghir, Tayeb Bouzid

Abstract:

This work presents a 2D numerical simulation of an earth dam to assess the behavior of its foundation after treatment with stone columns. This treatment aims to improve the bearing capacity, increase the mechanical properties of the soil, accelerate consolidation, reduce settlements and eliminate the liquefaction phenomenon in case of seismic excitation. For the evaluation of the pore pressures, the position of the phreatic line and the flow network were defined, and a seepage analysis was performed with the software MIDAS Soil Works. The consolidation calculation is performed through a simulation of the actual construction stages of the dam. These analyses were performed using the Mohr-Coulomb soil model, and the results are compared with the actual measurements of settlement gauges installed in the dam. An analysis of the bearing capacity was conducted to show the role of stone columns in improving the bearing capacity of the foundation.

Keywords: Earth dam, dam foundation, numerical simulation, stone columns, seepage analysis, consolidation, bearing capacity.

PDF Downloads: 1084
7271 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage

Authors: Oh Hyeon Jeon, WooYoung Jung

Abstract:

In this study, a seepage analysis driven by the water-level difference between the upstream and downstream sides of a weir structure was performed for the safety evaluation of the weir against flooding. The Monte Carlo simulation method was employed, considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir structure. Moreover, the weir structure was modeled using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level, taking into account the uncertainty of the corresponding permeability coefficient. Subsequently, a fragility function could be constructed based on the responses from the numerical analysis; this fragility function can be used to determine the weakness of the weir structure subjected to a flooding disaster. It can also be used as reference data to comprehensively predict the probability of failure and the degree of damage of a weir structure.
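
A toy sketch of how such a fragility curve can be assembled by Monte Carlo sampling of the permeability coefficient; the seepage relation and failure criterion below are crude stand-ins for the ABAQUS seepage analyses, chosen only to illustrate the procedure.

```python
# Minimal sketch of a fragility curve built by Monte Carlo sampling of the
# permeability coefficient; the Darcy relation, geometry and allowable flow are
# illustrative assumptions, not the paper's finite element model.
import numpy as np

rng = np.random.default_rng(3)
N = 5000
k = rng.lognormal(mean=np.log(1e-5), sigma=0.5, size=N)   # permeability, m/s (assumed)
levels = np.linspace(1.0, 6.0, 6)                         # upstream-downstream head difference, m
L_path, q_allow = 8.0, 5e-5                               # seepage path length, allowable flow (assumed)

for h in levels:
    q = k * (h / L_path) * 10.0          # Darcy flow per unit width over ~10 m depth
    p_fail = np.mean(q > q_allow)        # empirical point on the fragility curve
    print(f"head {h:3.1f} m -> P(failure) = {p_fail:.3f}")
```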

Keywords: Weir structure, seepage, flood disaster fragility, probabilistic risk assessment, Monte-Carlo Simulation, permeability coefficient.

PDF Downloads: 1135
7270 Comparative Study on Swarm Intelligence Techniques for Biclustering of Microarray Gene Expression Data

Authors: R. Balamurugan, A. M. Natarajan, K. Premalatha

Abstract:

Microarray gene expression data play a vital role in understanding biological processes, gene regulation and disease mechanisms. A bicluster in gene expression data is a subset of genes showing consistent patterns under a subset of conditions, and finding biclusters is an optimization problem. In recent years, swarm intelligence techniques have become popular because many real-world problems are increasingly large, complex and dynamic. Given the size and complexity of these problems, it is necessary to find an optimization technique that delivers a near-optimal solution within a reasonable amount of time. In this paper, the algorithmic concepts of the Particle Swarm Optimization (PSO), Shuffled Frog Leaping (SFL) and Cuckoo Search (CS) algorithms are analyzed on four benchmark gene expression datasets. The experimental results show that CS outperforms PSO and SFL on three datasets, while SFL gives better performance on one dataset. This work also determines the biological relevance of the biclusters with Gene Ontology in terms of function, process and component.
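
As a generic illustration of the swarm update underlying one of the compared methods, the sketch below runs a plain particle swarm optimisation on a placeholder objective; in the paper the objective would be a bicluster quality measure such as the mean squared residue.

```python
# Minimal sketch of the standard PSO velocity/position update; the fitness is a
# placeholder objective, not a bicluster quality measure.
import numpy as np

rng = np.random.default_rng(4)
n_particles, dim, iters = 30, 5, 200
w, c1, c2 = 0.72, 1.49, 1.49                       # common PSO coefficients

fitness = lambda x: np.sum(x ** 2, axis=-1)        # placeholder objective (to minimise)
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), fitness(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = fitness(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best fitness found:", float(fitness(gbest)))
```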

Keywords: Particle swarm optimization, Shuffled frog leaping, Cuckoo search, biclustering, gene expression data.

PDF Downloads: 2645
7269 A Model of Market Segmentation for the Customers of Mellat Bank in Iran

Authors: Nader Gharibnavaz, Hossein Yazdi

Abstract:

If an organization like Mellat Bank wants to identify its customer market completely in order to reach its specified goals, it can segment the market and offer the right product package to the right segment. Our objective is to offer a segmentation model for the Iranian banking market from Mellat Bank's point of view. The methodology of this project combines "segmentation on the basis of four part-quality variables" and "segmentation on the basis of difference in means". The required data are gathered from E-Systems and the researcher's personal observation. Finally, the research proposes that the organization first form a four-dimensional matrix with 756 segments using four variables named value-based, behavioral, activity style and activity level, and then calculate the mean profit for every cell of the matrix at two distinct work levels (α1: normal condition and α2: high-pressure condition) and compare the segments by checking two conditions, namely (1) homogeneity of every segment with its sub-segments and (2) heterogeneity with other segments, so that the necessary segmentation process can be carried out. The last suggestion (further explained by an operational example and a feedback algorithm) is to test and update the model because of the dynamic environment, technology and banking system.

Keywords: market segmentation model, banking system, Mellat bank

PDF Downloads: 3264
7268 High Strain Rate Characteristics of the Advanced Blast Energy Absorbers

Authors: Martina Drdlová, Michal Frank, Jaroslav Buchar, Josef Krátký

Abstract:

The main aim of the presented experiments is to improve the behaviour of sandwich structures under dynamic loading, such as crash or explosion. Several cellular materials are widely used as cores of sandwich structures, and their properties influence the response of the entire element under impact load. Optimizing their performance requires characterisation of the core material behaviour at high strain rates and identification of the underlying mechanisms. This work presents a study of the high strain-rate characteristics of a specific porous, lightweight, blast-energy-absorbing foam using a Split Hopkinson Pressure Bar (SHPB) technique adapted to perform tests on low-strength materials. Two different velocities, 15 and 30 m/s, were used to determine the strain-rate sensitivity of the material. The foams were designed using two types of porous lightweight spherical raw materials with diameters of 30-100 µm, combined with a polymer matrix. Cylindrical specimens with a diameter of 15 mm and a length of 7 mm were prepared and loaded using the Split Hopkinson Pressure Bar apparatus to assess the relation between the composition of the material and its shock wave attenuation capacity.

Keywords: Blast, foam, microsphere, resin.

PDF Downloads: 2455
7267 Evaluation and Analysis of the Secure E-Voting Authentication Preparation Scheme

Authors: Nidal F. Shilbayeh, Reem A. Al-Saidi, Ahmed H. Alsswey

Abstract:

In this paper, we present an evaluation and analysis of the E-Voting Authentication Preparation Scheme (EV-APS). EV-APS applies some modified security aspects that enhance the security measures and add a strong wall of protection, confidentiality, non-repudiation and authentication requirements. Some of these modified security aspects are the Kerberos authentication protocol, the PVID scheme, responder certificate validation, and the converted Ferguson e-cash protocol. Authentication and privacy requirements have been evaluated and proved: authentication guarantees that only eligible and authorized voters are permitted to vote, and privacy guarantees that all votes are kept secret. An evaluation and analysis of some of these security requirements is given. These modified aspects help in filtering the counter buffer from unauthorized votes by ensuring that only authorized voters are permitted to vote.

Keywords: E-Voting preparation stage, blind signature protocol, nonce based authentication scheme, Kerberos authentication protocol, pseudo voter identity scheme PVID.

PDF Downloads: 1597
7266 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes for reporting data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form a data set, constructed for each time point, that contains all of the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrating methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used because it tests one extreme value at a time against the bounds of the standard normal distribution. In the study area, the test has not been widely applied by other authors, except where the Grubbs test was used to detect outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that have more possibilities of extracting the best solution. For freight delivery management, genetic algorithm schemes are used as a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for multi-objective analysis which evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for a freight transfer service in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes.
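
A minimal sketch of the two-sided Grubbs test at the 99% confidence level used in the data-validation step; the fuel-consumption readings are invented for illustration.

```python
# Minimal sketch of a two-sided Grubbs outlier test (alpha = 0.01); the data
# values are made-up examples, not figures from the study.
import numpy as np
from scipy import stats

def grubbs_outlier(x, alpha=0.01):
    """Return (is_outlier, index) for the single most extreme value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    idx = np.argmax(np.abs(x - x.mean()))
    G = np.abs(x[idx] - x.mean()) / x.std(ddof=1)
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t ** 2 / (n - 2 + t ** 2))
    return G > G_crit, int(idx)

fuel = [31.2, 30.8, 29.9, 30.5, 31.0, 30.7, 45.6, 30.3]   # litres/100 km (invented)
print(grubbs_outlier(fuel))   # flags the 45.6 reading
```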

Keywords: Multi-objective decision support, analysis, data validation, freight delivery, multi-modal transportation, genetic programming methods.

PDF Downloads: 460
7265 Analysis of Possible Causes of Fukushima Disaster

Authors: Abid Hossain Khan, Syam Hasan, M. A. R. Sarkar

Abstract:

The Fukushima disaster is one of the most publicly exposed accidents at a nuclear facility, and it has changed people's outlook on nuclear power. Some have used it as an example to establish nuclear energy as an unsafe source, while others have tried to find the real reasons behind the accident. Many papers have tried to shed light on the possible causes, some purely based on assumptions and others relying on rigorous data analysis. To the best of our knowledge, none of these works can say with absolute certainty that a single prominent reason paved the way to this unexpected incident. This paper attempts to compile all the apparent reasons behind the Fukushima disaster and tries to analyze and identify the most likely one.

Keywords: Fuel meltdown, Fukushima disaster, manmade calamity, nuclear facility, tsunami.

PDF Downloads: 2146
7264 Stability Analysis of Impulsive Stochastic Fuzzy Cellular Neural Networks with Time-varying Delays and Reaction-diffusion Terms

Authors: Xinhua Zhang, Kelin Li

Abstract:

In this paper, the problem of stability analysis for a class of impulsive stochastic fuzzy cellular neural networks with time-varying delays and reaction-diffusion terms is considered. By utilizing a suitable Lyapunov-Krasovskii functional, the inequality technique and stochastic analysis techniques, some sufficient conditions ensuring global exponential stability of the equilibrium point for impulsive stochastic fuzzy cellular neural networks with time-varying delays and diffusion are obtained. In particular, an estimate of the exponential convergence rate is also provided, which depends on the system parameters, the diffusion effect and the impulsive disturbance intensity. It is believed that these results are significant and useful for the design and application of fuzzy neural networks. An example is given to show the effectiveness of the obtained results.

Keywords: Exponential stability, stochastic fuzzy cellular neural networks, time-varying delays, impulses, reaction-diffusion terms.

PDF Downloads: 1356
7263 Visualization of Quantitative Thresholds in Stocks

Authors: Siddhant Sahu, P. James Daniel Paul

Abstract:

Technical analysis, comprising various technical indicators, is a holistic way of representing the price movement of stocks in the market. Various forms of indicators have evolved from the primitive ones of past decades, and there have been many attempts to introduce volume as a major determinant of strong patterns in market forecasting. The law of demand defines the relationship between volume and price, and most traders are familiar with the volume game. Adding the time dimension to the law of demand provides a different visualization of the theory. While attempting this, it was found that there are different thresholds in the market for different companies, and these thresholds have a significant influence on price. This article is an attempt at determining these thresholds for companies using three-dimensional graphs for optimizing portfolios. It also emphasizes the importance of volume as a key factor in predicting strong price movements and bullish and bearish markets. It uses a comprehensive data set of major companies which form a major chunk of the Indian automotive sector and are thus used as an illustration.

Keywords: Technical Analysis, Expert System, Law of demand, Stocks, Portfolio Analysis, Indian Automotive Sector.

PDF Downloads: 2066
7262 Knowledge Based Wear Particle Analysis

Authors: Mohammad S. Laghari, Qurban A. Memon, Gulzar A. Khuwaja

Abstract:

The paper describes a knowledge-based system for the analysis of microscopic wear particles. Wear particles contained in lubricating oil carry important information concerning machine condition, in particular the state of wear. Experts (tribologists) in the field extract this information to monitor the operation of the machine and ensure safety, efficiency, quality, productivity, and economy of operation. This procedure is not always objective, and it can also be expensive. The aim is to classify these particles according to their morphological attributes of size, shape, edge detail, thickness ratio, color, and texture, and, by using this classification, to predict wear failure modes in engines and other machinery. The attribute knowledge links human expertise to the devised Knowledge Based Wear Particle Analysis System (KBWPAS). The system provides an automated and systematic approach to wear particle identification which is linked directly to the wear processes and modes that occur in machinery. This brings consistency to wear judgment prediction, which leads to standardization and also less dependence on tribologists.

Keywords: Computer vision, knowledge based systems, morphology, wear particles.

PDF Downloads: 1721
7261 The Design and Development of Multimedia Pronunciation Learning Management System

Authors: Fei Ping Por, Soon Fook Fong

Abstract:

The Multimedia Pronunciation Learning Management System (MPLMS) proposed in this study is a technology with profound potential for improving pronunciation learning. The MPLMS optimizes digitised phonetic symbols with the integration of text, sound and mouth-movement video. The components are designed and developed in an online management system which turns the web into a dynamic, user-centric collection of consistent and timely information for quality sustainable learning. The aim of this study is to design and develop the MPLMS, which serves as an innovative tool to improve English pronunciation. This paper discusses the iterative methodology and the three-phase Alessi and Trollip model used in the development of the MPLMS. To align with the flexibility required in the development of educational software, an iterative approach comprising plan, design, develop, evaluate and implement is followed. To ensure the instructional appropriateness of the MPLMS, the instructional system design (ISD) model of Alessi and Trollip serves as a platform to guide the important instructional factors and processes. It is expected that the results of future empirical research will support the efficacy of the MPLMS and its place as a premier pronunciation learning system.

Keywords: Design, development, multimedia, pronunciation, learning management system

PDF Downloads: 2423
7260 Refinement of Object-Z Specifications Using Morgan's Refinement Calculus

Authors: Mehrnaz Najafi, Hassan Haghighi

Abstract:

Morgan's refinement calculus (MRC) is one of the well-known methods allowing the formality presented in a program specification to be carried all the way to code. Object-Z (OZ), on the other hand, is an extension of Z that adds support for classes and objects. There are a number of methods for obtaining code from OZ specifications, which can be categorized into refinement and animation methods. As far as we know, only one refinement method exists that refines OZ specifications into code; however, this method does not have fine-grained refinement rules and thus cannot be automated. On the other hand, existing animation methods do not present mapping rules formally and do not support the mapping of several important constructs of OZ, such as all cases of operation expressions and most constructs of the global paragraph. In this paper, with the aim of providing an automatic path from OZ specifications to code, we propose an approach that maps OZ specifications into their counterparts in MRC in order to use the fine-grained refinement rules of MRC. In this way, having counterparts of our specifications in MRC, we can refine them into code automatically using MRC tools such as RED. Other advantages of our work are that it proposes the mapping rules formally, supports the mapping of all important constructs of Object-Z, and considers dynamic instantiation of objects, a facility that OZ itself does not cover.

Keywords: Formal method, Formal specification, Formal program development, Morgan's Refinement Calculus, Object-Z

PDF Downloads: 1301
7259 Physicochemical Characterization of Waste from Vegetal Extracts Industry for Use as Briquettes

Authors: Maíra O. Palm, Cintia Marangoni, Ozair Souza, Noeli Sellin

Abstract:

Wastes from a vegetal extracts industry (cocoa, oak, Guarana and mate) were characterized by particle size, proximate and ultimate analysis, lignocellulosic fractions, high heating value, thermal analysis (thermogravimetric analysis, TGA, and differential thermal analysis, DTA) and energy density, to evaluate their potential as biomass in the form of briquettes for power generation. All of the wastes presented particle sizes adequate for briquette production. The wastes showed high moisture content, requiring prior drying for use as briquettes. Cocoa and oak wastes had the highest volatile matter contents, with maximum mass loss at 310 ºC and 450 ºC, respectively. The solvents used in the aroma extraction process influenced the moisture content of the wastes, which was highest for mate because water was used as the solvent. All wastes showed insignificant mass loss above 565 °C, hence resulting in low ash content. High carbon and hydrogen contents and low sulfur and nitrogen contents were observed, ensuring low generation of sulfur and nitrogen oxides. Mate and cocoa exhibited the highest carbon and lignin contents and the highest heating values. The dried wastes had high heating values, from 17.1 MJ/kg to 20.8 MJ/kg. The results indicate the energy potential of these wastes for use as fuel in power generation.

Keywords: Agro-industrial waste, biomass, briquettes, combustion.

PDF Downloads: 1011
7258 Selecting an Advanced Creep Model or a Sophisticated Time-Integration? A New Approach by Means of Sensitivity Analysis

Authors: Holger Keitel

Abstract:

The prediction of long-term deformations of concrete and reinforced concrete structures has been a field of extensive research, and several different creep models have been developed so far. Most of these models were developed for constant concrete stresses; thus, in the case of varying stresses, a specific superposition principle or time-integration method is necessary. Nowadays, when modeling concrete creep, the engineering focus is rather on the application of sophisticated time-integration methods than on choosing the more appropriate creep model. For this reason, this paper presents a method to quantify the uncertainties of creep prediction originating from the selection of the creep model or of the time-integration method. By adapting variance-based global sensitivity analysis, a methodology is developed to quantify the influence of creep model selection and the choice of time-integration method. Applying the developed method, general recommendations on how to model creep behavior under varying stresses are given.
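
A minimal sketch of a variance-based (Sobol) first-order sensitivity estimate of the kind adapted here, using a toy creep-response surrogate with two coded factors in place of the actual creep models and time-integration schemes.

```python
# Minimal sketch of first-order Sobol indices via the Saltelli pick-freeze
# estimator; the surrogate creep response and factor coding are illustrative.
import numpy as np

rng = np.random.default_rng(5)
N = 20_000

def creep_response(model_factor, integration_factor):
    """Toy surrogate for the predicted creep coefficient (illustrative only)."""
    return (2.0 + 0.8 * model_factor + 0.2 * integration_factor
            + 0.1 * model_factor * integration_factor)

A = rng.uniform(-1, 1, (N, 2))
B = rng.uniform(-1, 1, (N, 2))
fA = creep_response(A[:, 0], A[:, 1])
fB = creep_response(B[:, 0], B[:, 1])
var = np.var(np.concatenate([fA, fB]))

for i, name in enumerate(["creep model choice", "time-integration choice"]):
    AB = A.copy()
    AB[:, i] = B[:, i]
    fAB = creep_response(AB[:, 0], AB[:, 1])
    S_i = np.mean(fB * (fAB - fA)) / var        # Saltelli first-order estimator
    print(f"first-order Sobol index, {name}: {S_i:.2f}")
```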

Keywords: Concrete creep models, time-integration methods, sensitivity analysis, prediction uncertainty.

PDF Downloads: 1522
7257 Measurement and Analysis of Temperature Effects on Box Girders of Continuous Rigid Frame Bridges

Authors: Bugao Wang, Weifeng Wang, Xianwei Zeng

Abstract:

Research on the general rules of temperature field variation and its effects on bridges under construction is necessary. This paper investigated these rules and their effects on a bridge using on-site measurement and computational analysis, with Guanyinsha Bridge as a case study. The temperature field was simulated in the analyses. The effects of boundary conditions such as solar radiation and wind speed, and of model parameters such as the heat factor and specific heat, on the temperature field are investigated, and recommended values for these parameters are proposed. The simulated temperature field matches the measured observations with high accuracy. At the same time, the stresses and deflections of the bridge computed with the simulated temperature field match the measured values. In conclusion, the temperature effect analysis of a reinforced concrete box girder can be conducted directly based on reliable weather data for the area concerned.

Keywords: continuous rigid frame bridge, temperature effect analysis, temperature field, temperature field simulation

PDF Downloads: 2549
7256 A Nanosensor System Based On Disuccinimidyl–CYP2E1 for Amperometric Detection of the Anti-Tuberculosis Drug, Pyrazinamide

Authors: R. F. Ajayi, U. Sidwaba, U. Feleni, S. F. Douman, E. Nxusani, L. Wilson, C. Rassie, O. Tovide, P. G. L. Baker, S. L. Vilakazi, R. Tshikhudo, E. I. Iwuoha

Abstract:

Pyrazinamide (PZA) is among the first-line pro-drugs in the tuberculosis (TB) combination chemotherapy used to treat Mycobacterium tuberculosis. Numerous reports have suggested that hepatotoxicity due to pyrazinamide in patients results from inappropriate dosing. It is therefore necessary to develop sensitive and reliable techniques for determining the PZA metabolic profile of diagnosed patients promptly and at the point of care. This study reports the determination of PZA based on nanobiosensor systems developed from disuccinimidyl octanedioate-modified cytochrome P450-2E1 (CYP2E1) electrodeposited on gold substrates derivatised with poly(8-anilino-1-naphthalene sulphonic acid) (PANSA)/PVP-AgNPs nanocomposites. The rapid and sensitive amperometric PZA detection gave a dynamic linear range of 2 µM to 16 µM, a limit of detection of 0.044 µM and a sensitivity of 1.38 µA/µM. The Michaelis-Menten parameters KM, apparent KM and IMAX were calculated to be 6.0 µM, 1.41 µM and 1.51 × 10⁻⁶ A, respectively, indicating a nanobiosensor suitable for use in serum.
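
For reference, Michaelis-Menten parameters of this kind can be obtained by fitting the amperometric calibration curve; the sketch below uses invented current readings, not the published data set.

```python
# Minimal sketch of fitting I = IMAX*[S]/(KM + [S]) to calibration data; the
# current values below are invented for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(S, i_max, km):
    return i_max * S / (km + S)

S = np.array([2, 4, 6, 8, 10, 12, 14, 16], dtype=float)               # PZA, uM
i = np.array([0.9, 1.1, 1.2, 1.28, 1.32, 1.36, 1.38, 1.40]) * 1e-6    # current, A

(i_max, km), _ = curve_fit(michaelis_menten, S, i, p0=[1.5e-6, 2.0])
print(f"IMAX ~ {i_max:.2e} A, apparent KM ~ {km:.2f} uM")
```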

Keywords: Cytochrome P450-2E1, Disuccinimidyl octanedioate, Pyrazinamide, Tuberculosis.

PDF Downloads: 2203
7255 Quantifying and Adjusting the Effects of Publication Bias in Continuous Meta-Analysis

Authors: N.R.N. Idris

Abstract:

This study uses simulated meta-analyses to assess the effects of publication bias on meta-analysis estimates and to evaluate the efficacy of the trim-and-fill method in adjusting for these biases. The estimated effect sizes and standard errors were evaluated in terms of statistical bias and coverage probability. The results demonstrate that if publication bias is not adjusted for, it can lead to up to 40% bias in the treatment effect estimates. Utilization of the trim-and-fill method can reduce the bias in the overall estimate by more than half. The method is optimal in the presence of moderate underlying bias but has minimal effect in the presence of low or severe publication bias. Additionally, the trim-and-fill method improves the coverage probability by more than half when subjected to the same level of publication bias as the unadjusted data. The method, however, tends to produce false positive results and will incorrectly adjust the data for publication bias up to 45% of the time. Nonetheless, the bias introduced into the estimates by this adjustment is minimal.
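
A minimal sketch of the kind of simulation described: generate study effects, censor "non-significant" studies to mimic publication bias, and compare pooled estimates. Effect sizes and the censoring rule below are illustrative only, not the study's simulation design.

```python
# Minimal sketch of simulating publication bias and its effect on a pooled
# (fixed-effect) meta-analysis estimate; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(6)
true_effect, n_studies = 0.3, 200
se = rng.uniform(0.05, 0.4, n_studies)                  # study standard errors
effects = rng.normal(true_effect, se)                   # observed effect sizes

# "publication": keep significant studies, plus a random 30% of the rest
published = (effects / se > 1.96) | (rng.random(n_studies) < 0.3)

def pooled(d, s):                                       # fixed-effect pooling
    w = 1 / s ** 2
    return np.sum(w * d) / np.sum(w)

print("pooled, all studies    :", round(pooled(effects, se), 3))
print("pooled, published only :", round(pooled(effects[published], se[published]), 3))
```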

Keywords: Publication bias, Trim and Fill method, percentage relative bias, coverage probability

PDF Downloads: 1532
7254 Searching the Efficient Frontier for the Coherent Covering Location Problem

Authors: Felipe Azocar Simonet, Luis Acosta Espejo

Abstract:

In this article, we seek an approximation of the efficient frontier for the bi-objective location problem with coherent coverage for two levels of hierarchy (CCLP). We present the mathematical formulation of the model used. Supported and unsupported efficient solutions are obtained by solving the bi-objective combinatorial problem through the weights method using a Lagrangean heuristic. Subsequently, the results are validated through DEA analysis with the GEM index (global efficiency measurement).

Keywords: Coherent covering location problem, efficient frontier, Lagrangian relaxation, data envelopment analysis.

PDF Downloads: 786
7253 DC-Link Voltage Control of DC-DC Boost Converter-Inverter System with PI Controller

Authors: Thandar Aung, Tun Lin Naing

Abstract:

In this paper, DC-link voltage control of a DC-DC boost converter-inverter system is proposed. The mathematical model is developed from four different sub-circuits that depend on the switch positions. The resulting differential equations are combined to develop the dynamic model, and a transfer function is generated from the switched-function model. Fluctuation of the DC-link voltage causes connected loads to malfunction. To address this problem, a traditional controller, the PI controller, is applied to achieve a constant DC-link voltage. The PI controller gains are obtained based on the transfer function step response. The simulation work was carried out using MATLAB/Simulink software, and a hardware prototype was implemented with a low-cost microcontroller, the Arduino Nano. Experimental results were collected using the ArduinoIO library package. The closed-loop DC-link voltage control system was tested with various line and load disturbances. It is found that the experimental results agree with the simulation results.
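
A minimal sketch of the closed loop described: an averaged boost-converter model with a PI controller adjusting the duty cycle to regulate the DC-link voltage. Component values and gains are assumed for illustration, not those of the paper's prototype.

```python
# Minimal sketch: averaged boost-converter state equations with a PI controller
# on the DC-link voltage; component values and gains are illustrative assumptions.
Vin, L, C, R = 12.0, 1e-3, 470e-6, 20.0        # source, inductor, capacitor, load (assumed)
kp, ki = 0.001, 0.5                            # PI gains (assumed)
v_ref, dt, t_end = 24.0, 1e-6, 0.4             # reference voltage, step, duration

iL = vC = integ = 0.0
for _ in range(int(t_end / dt)):
    err = v_ref - vC
    integ += err * dt
    d = min(max(kp * err + ki * integ, 0.05), 0.90)   # duty cycle with saturation
    # averaged (duty-cycle weighted) state equations of the boost converter
    diL = (Vin - (1.0 - d) * vC) / L
    dvC = ((1.0 - d) * iL - vC / R) / C
    iL += diL * dt
    vC += dvC * dt

print(f"DC-link voltage after {t_end:.1f} s: {vC:.2f} V (reference {v_ref} V)")
```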

Keywords: ArduinoIO library package, boost converter-inverter system, low cost microcontroller, PI controller, switched function model.

PDF Downloads: 1384
7252 Method of Intelligent Fault Diagnosis of Preload Loss for Single Nut Ball Screws through the Sensed Vibration Signals

Authors: Yi-Cheng Huang, Yan-Chen Shin

Abstract:

This paper proposes a method of diagnosing ball screw preload loss through the Hilbert-Huang Transform (HHT) and multi-scale entropy (MSE). The proposed method can diagnose ball screw preload loss from vibration signals while the machine tool is in operation. Ball screws with maximum dynamic preloads of 2%, 4%, and 6% were designed, manufactured, and tested experimentally. Signal patterns are discussed and revealed using Empirical Mode Decomposition (EMD) with the Hilbert spectrum. Different preload features are extracted and discriminated using HHT. The irregularity development of a ball screw with preload loss is determined and abstracted using MSE based on complexity perception. Experimental results show that the proposed method can predict the status of ball screw preload loss. Smart sensing of the health of the ball screw is also possible based on a comparative evaluation of MSE through the signal processing and pattern matching of EMD/HHT. This diagnostic method thus provides a convenient, prognosis-oriented way of detecting preload loss.
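
A minimal sketch of the multiscale-entropy stage (coarse-graining followed by sample entropy) applied to a synthetic vibration signal; the tolerance r = 0.15·std follows common practice and is not necessarily the paper's setting.

```python
# Minimal sketch of multiscale entropy: coarse-grain the signal at several
# scales and compute sample entropy at each; the test signal is synthetic.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.15):
    """Sample entropy of a 1-D series (tolerance r = r_factor * std)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (np.sum(dist <= r) - len(templates)) / 2.0   # pairs, self-matches excluded

    return -np.log(match_count(m + 1) / match_count(m))

def multiscale_entropy(x, max_scale=5):
    out = []
    for s in range(1, max_scale + 1):
        coarse = x[: len(x) // s * s].reshape(-1, s).mean(axis=1)
        out.append(sample_entropy(coarse))
    return out

rng = np.random.default_rng(7)
vibration = np.sin(0.05 * np.arange(1000)) + 0.3 * rng.standard_normal(1000)
print([round(v, 2) for v in multiscale_entropy(vibration)])
```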

Keywords: Empirical Mode Decomposition, Hilbert-Huang Transform, Multi-scale Entropy, Preload Loss, Single-nut Ball Screw

PDF Downloads: 2811