Search results for: level set method.


8091 Compressible Lattice Boltzmann Method for Turbulent Jet Flow Simulations

Authors: K. Noah, F.-S. Lien

Abstract:

In Computational Fluid Dynamics (CFD), there is a variety of numerical methods, some of which rely on macroscopic model representations that can be solved by finite-volume, finite-element or finite-difference methods, while others rely on a microscopic description. The lattice Boltzmann method (LBM), by contrast, is considered a mesoscopic particle method, with its scale lying between the macroscopic and microscopic scales. The LBM works well for incompressible flow problems, but certain limitations arise when solving compressible flows, particularly at high Mach numbers. An improved lattice Boltzmann model for compressible flow problems is presented in this study. A higher-order Taylor series expansion of the Maxwell equilibrium distribution function is used to overcome the limitations of the LBM when solving high-Mach-number flows. Large eddy simulation (LES) is implemented in the LBM to simulate turbulent jet flows. The results have been validated against available experimental data for turbulent compressible free jet flow at subsonic speeds.
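As a rough illustration of the kind of higher-order expansion referred to above, the sketch below evaluates a D2Q9 Maxwell equilibrium including the third-order Taylor/Hermite terms that become relevant at higher Mach numbers; the lattice, weights and truncation order are textbook choices, not necessarily the authors' exact model.

```python
import numpy as np

# D2Q9 lattice: velocities e_i and weights w_i (an assumed, standard choice).
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]], dtype=float)
W = np.array([4/9] + [1/9]*4 + [1/36]*4)
CS2 = 1.0 / 3.0  # lattice speed of sound squared

def equilibrium(rho, u, order=3):
    """Maxwell equilibrium expanded as a Taylor (Hermite) series up to 'order'."""
    eu = E @ u                      # e_i . u for each lattice direction
    uu = u @ u                      # |u|^2
    feq = 1.0 + eu / CS2 + eu**2 / (2 * CS2**2) - uu / (2 * CS2)
    if order >= 3:                  # extra terms that matter at higher Mach numbers
        feq += eu**3 / (6 * CS2**3) - eu * uu / (2 * CS2**2)
    return rho * W * feq

print(equilibrium(1.0, np.array([0.1, 0.05])))
```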

Keywords: Compressible lattice Boltzmann method, large eddy simulation, turbulent jet flows.

8090 Reducing Uncertainty of Monte Carlo Estimated Fatigue Damage in Offshore Wind Turbines Using FORM

Authors: Jan-Tore H. Horn, Jørgen Juncher Jensen

Abstract:

Uncertainties related to fatigue damage estimation of non-linear systems are highly dependent on the tail behaviour and extreme values of the stress range distribution. By using a combination of the First Order Reliability Method (FORM) and Monte Carlo simulations (MCS), the accuracy of the fatigue estimations may be improved for the same computational efforts. The method is applied to a bottom-fixed, monopile-supported large offshore wind turbine, which is a non-linear and dynamically sensitive system. Different curve fitting techniques to the fatigue damage distribution have been used depending on the sea-state dependent response characteristics, and the effect of a bi-linear S-N curve is discussed. Finally, analyses are performed on several environmental conditions to investigate the long-term applicability of this multistep method. Wave loads are calculated using state-of-the-art theory, while wind loads are applied with a simplified model based on rotor thrust coefficients.
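Since the abstract mentions the effect of a bi-linear S-N curve, a minimal sketch of Miner damage accumulation over Monte Carlo sampled stress ranges with a two-slope S-N curve is given below; the curve constants and the Weibull stress-range distribution are placeholders, not the study's values.

```python
import numpy as np

def sn_cycles_to_failure(s, loga1=12.164, m1=3.0, loga2=15.606, m2=5.0, s_knee=52.64):
    """Bi-linear S-N curve: N = a1*S^-m1 above the knee stress, a2*S^-m2 below it.
    The constants are illustrative (similar to a DNV-type curve), not the paper's."""
    s = np.asarray(s, dtype=float)
    return np.where(s >= s_knee, 10**loga1 * s**(-m1), 10**loga2 * s**(-m2))

def miner_damage(stress_ranges):
    """Palmgren-Miner damage sum for a set of stress-range cycles."""
    return np.sum(1.0 / sn_cycles_to_failure(stress_ranges))

# Crude MCS stand-in: Weibull-distributed stress ranges for one simulated sea state.
rng = np.random.default_rng(0)
stress = 30.0 * rng.weibull(1.2, size=100_000)   # MPa, placeholder distribution
print("Estimated damage:", miner_damage(stress))
```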

Keywords: Fatigue damage, FORM, monopile, Monte Carlo simulation, reliability, wind turbine.

8089 Study on Two Way Reinforced Concrete Slab Using ANSYS with Different Boundary Conditions and Loading

Authors: A. Gherbi, L. Dahmani, A. Boudjemia

Abstract:

This paper presents a Finite Element Method (FEM) analysis of the failure pattern of a rectangular slab with various edge conditions. Nonlinear static analysis is carried out using the ANSYS 15 software. Using SOLID65 solid elements, the compressive crushing of concrete is captured through a plasticity algorithm, while concrete cracking in the tension zone is accommodated by the nonlinear material model. Smeared reinforcement is used and introduced as a percentage of steel embedded in the concrete slab. The behavior of the analyzed concrete slab has been observed in terms of crack pattern and displacement for various loading and boundary conditions. The finite element results are also compared with experimental data. Another objective of the present study is to show how similar the crack paths found by the ANSYS program are to those observed in yield line analysis. The smeared reinforcement method is found to be more practical, especially for layered elements like concrete slabs. The value of this method is that it does not require explicit modeling of the rebar, so a much coarser mesh can be defined.

Keywords: ANSYS, cracking pattern, displacements, RC Slab, smeared reinforcement.

8088 Controlling 6R Robot by Visionary System

Authors: Azamossadat Nourbakhsh, Moharram Habibnezhad Korayem

Abstract:

In visual servoing systems, the data obtained by the vision system are used to control robots. In this project, the simulator previously proposed for simulating the performance of a 6R robot was first examined in terms of software and testing, and its existing defects were corrected. In the first version of the simulation, the robot was directed toward the target object only with a position-based method using two cameras in the environment. In the new version of the software, three cameras are used simultaneously. The camera installed as eye-in-hand on the end-effector of the robot is used for visual servoing with a feature-based method. The target object is recognized according to its characteristics, and the robot is directed toward the object by an algorithm similar to the function of human eyes. Then, the function and accuracy of the robot's operation are examined through the position-based visual servoing method using two cameras installed as eye-to-hand in the environment. Finally, the obtained results are tested under the ANSI/RIA R15.05-2 standard.
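For the feature-based (image-based) servoing step, a classical control law is v = -λ L⁺ (s - s*); the sketch below is a generic implementation of that law with a placeholder interaction matrix, not the authors' controller.

```python
import numpy as np

def ibvs_velocity(s, s_star, L, gain=0.5):
    """Classical image-based visual servoing law: v = -lambda * pinv(L) * (s - s*)."""
    error = np.asarray(s, float) - np.asarray(s_star, float)
    return -gain * np.linalg.pinv(L) @ error   # 6-vector: camera (v, omega)

# Toy example: 2 image points (4 features); the 4x6 interaction matrix holds placeholder values.
L = np.random.default_rng(1).standard_normal((4, 6))
s, s_star = np.array([0.1, 0.2, -0.1, 0.05]), np.zeros(4)
print(ibvs_velocity(s, s_star, L))
```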

Keywords: 6R robot, camera, visual servoing, feature-based visual servoing, position-based visual servoing, performance tests.

8087 The Measurement of Latvian and Russian Ethnic Attitudes, Using Evaluative Priming Task and Self-Report Methods

Authors: Maria Bambulyaka, Irina Plotka, Nina Blumenau, Dmitry Igonin, Elena Ozola, Laura Shimane

Abstract:

The purposes of the research were to estimate implicit ethnic attitudes by direct and indirect methods, to determine the agreement between the two types of measurement, to investigate the influence of the task type used in the experiment on the measurement results, and to determine the presence of a relationship between recent episodic events and chronological correlates of ethnic attitudes. The method of implicit measurement was an evaluative priming task (EPT) carried out with different SOA intervals; the explicit methods were G. Soldatova's types of ethnic identity, G. Soldatova's index of tolerance, and the E. Bogardus scale of social distance. The results obtained over five stages of the research shed light on some aspects of implicit measurement, its correlation with the results of self-reports at different SOA intervals, and the connection of implicit measurement with the emotional valence of participants' episodic events and other indices, contributing to the resolution of the problem of applying implicit measurement to the study of different social constructs.

Keywords: Ethnic attitudes, explicit method, implicit method, priming.

8086 Variation in the Traditional Knowledge of Curcuma longa L. in North-Eastern Algeria

Authors: A. Bouzabata, A. Boukhari

Abstract:

Curcuma longa L. (Zingiberaceae), commonly known as turmeric, has a long history of traditional uses for culinary purposes as a spice and a food colorant. The present study aimed to document the ethnobotanical knowledge about Curcuma longa, and to assess the variation in the herbalists’ experience in Northeastern Algeria. Data were collected using semi-structured questionnaires and direct interviews with 30 herbalists. Ethnobotanical indices, including the fidelity level (FL%), the relative frequency citation (RFC), and use value (UV) were determined by quantitative methods. Diversity in the level of knowledge was analyzed using univariate, non-parametric, and multivariate statistical methods. Three main categories of uses were recorded for C. longa: for food, for medicine, and for cosmetic purposes. As a medicine, turmeric was used for the treatment of gastrointestinal, dermatological, and hepatic diseases. Medicinal and food uses were correlated with both forms of preparation (rhizome and powder). The age group did not influence the use. Multivariate analyses showed a significant variation in traditional knowledge, associated with the use value, origin, quality, and efficacy of the drug. The findings suggested that the geographical origin of C. longa affected the use in Algeria.
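The quantitative indices named above have standard definitions (RFC = FC/N, UV = ΣU/N, FL% = 100·Np/Nu); a minimal sketch computing them from citation counts follows, with the counts invented for illustration.

```python
def rfc(fc, n_informants):
    """Relative frequency of citation: informants citing the species / all informants."""
    return fc / n_informants

def use_value(total_use_reports, n_informants):
    """Use value: sum of use reports for the species / number of informants."""
    return total_use_reports / n_informants

def fidelity_level(np_use, n_citing_species):
    """Fidelity level (%): informants citing a given use / informants citing the species."""
    return 100.0 * np_use / n_citing_species

# Hypothetical counts for C. longa among 30 herbalists.
print(rfc(27, 30), use_value(64, 30), fidelity_level(18, 27))
```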

Keywords: Curcuma longa, curcuma indices, ethnobotanical knowledge, variation.

8085 VISMA: A Method for System Analysis in Early Lifecycle Phases

Authors: Walter Sebron, Hans Tschürtz, Peter Krebs

Abstract:

The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system and on the respective lifecycle phase. However, the analysis method chain still shows gaps, as it should support system analysis during the lifecycle of a system from a rough concept in the pre-project phase until end-of-life. This paper’s goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, like the conceptual or pre-project phase, or the project start phase. It was originally developed to aid in the definition of the system boundary of electronic system parts, e.g. a control unit for a pump motor, but it can also be applied to non-electronic system parts. The VISMA method is a graphical, sketch-like method that stratifies a system and its parts into inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components, followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined in order to enable a proper shell build-up. In the first step, the innermost system, named the system under consideration (SUC), is selected; it is the focus of the subsequent analysis. Then, its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified. These components are the content of the first shell around the SUC. Next, the input and output components of the components in the first shell are identified and form the second shell around the first one. Continuing this way, shell after shell is added with its respective parts until the border of the complete system (external border) is reached. Last, two external shells are added to complete the system view: the environment shell and the use case shell. This system view is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC, as well as output chains from the SUC, are described graphically via arrows to highlight functional chains through the system. As a result, this method offers a clear graphical description and overview of a system, its main parts and its environment; however, the focus still remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC, as well as important external interfaces of the overall system. It supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and cross-functional understanding of a system, its contributing parts, internal relationships and possible dangers within a multidisciplinary development team.
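A minimal way to mechanize the first (inside-out) step is a breadth-first layering of a component graph around the SUC; the sketch below assumes components and their input/output links are available as a dictionary, which is an illustration rather than part of the VISMA method itself.

```python
from collections import deque

def build_shells(links, suc):
    """Group components into shells by their link distance from the system under
    consideration (SUC). 'links' maps each component to the components it exchanges
    input/output with (treated as undirected for shell membership)."""
    shells, visited, frontier = [], {suc}, deque([(suc, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth > 0:
            if depth > len(shells):
                shells.append(set())
            shells[depth - 1].add(node)
        for neigh in links.get(node, ()):
            if neigh not in visited:
                visited.add(neigh)
                frontier.append((neigh, depth + 1))
    return shells

# Hypothetical pump-motor control unit example (component names invented for illustration).
links = {
    "control_unit": ["motor_driver", "speed_sensor"],
    "motor_driver": ["control_unit", "pump_motor"],
    "speed_sensor": ["control_unit", "pump_motor"],
    "pump_motor": ["motor_driver", "speed_sensor", "pump"],
    "pump": ["pump_motor"],
}
print(build_shells(links, "control_unit"))
# shells: [{'motor_driver', 'speed_sensor'}, {'pump_motor'}, {'pump'}]
```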

Keywords: Analysis methods, functional safety, hazard identification, system and safety engineering, system boundary definition, system safety.

8084 Method for Concept Labeling Based on Mapping between Ontology and Thesaurus

Authors: Kazuki Sonoda, Masahiro Hori

Abstract:

When designing information systems that deal with large amounts of domain knowledge, system designers need to consider ambiguities of labeling terms in the domain vocabulary for navigating users in the information space. The goal of this study is to develop a methodology for system designers to label navigation items, taking account of ambiguities stemming from synonyms or polysemes of labeling terms. In this paper, we propose a method for concept labeling based on mappings between a domain ontology and a thesaurus, and report the results of an empirical evaluation.

Keywords: Concept Labeling, Ontology, Thesaurus, Vocabulary Problem

8083 Comparison of Power Generation Status of Photovoltaic Systems under Different Weather Conditions

Authors: Zhaojun Wang, Zongdi Sun, Qinqin Cui, Xingwan Ren

Abstract:

Based on multivariate statistical analysis theory, this paper uses the principal component analysis method, the Mahalanobis distance analysis method and a fitting method to establish a photovoltaic health model to evaluate the health of photovoltaic panels. First of all, according to weather conditions, the photovoltaic panel data are classified into five categories: sunny, cloudy, rainy, foggy, and overcast. The health of photovoltaic panels in these five types of weather is studied. Secondly, a scatterplot of the relationship between the amount of electricity produced in each kind of weather and the other variables was plotted. It was found that the amount of electricity generated by photovoltaic panels has a significant nonlinear relationship with time. The fitting method was used to fit the relationship between the amount of electricity generated and time, and a nonlinear equation was obtained. Then, the principal component analysis method was used to analyze the independent variables under the five weather conditions. According to the Kaiser-Meyer-Olkin test, three types of weather, namely overcast, foggy, and sunny, meet the conditions for factor analysis, while cloudy and rainy weather do not. Therefore, through the principal component analysis method, the main components of overcast weather are temperature, AQI, and PM2.5; the main component of foggy weather is temperature; and the main components of sunny weather are temperature, AQI, and PM2.5. Cloudy and rainy weather require analysis of all of their variables, namely temperature, AQI, PM2.5, solar radiation intensity and time. Finally, taking the variable values in sunny weather as observed values and the main components of cloudy, foggy, overcast and rainy weather as sample data, the Mahalanobis distances between the observed values and these sample values are obtained. A comparative analysis was carried out to compare the degree of deviation of the Mahalanobis distance to determine the health of the photovoltaic panels under different weather conditions. It was found that the weather conditions in which the Mahalanobis distance fluctuations ranged from small to large were: foggy, cloudy, overcast and rainy.
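As a small illustration of the distance measure used for the health comparison, the sketch below computes Mahalanobis distances of observation vectors from a reference sample with NumPy, following the general formula d² = (x − μ)ᵀ Σ⁻¹ (x − μ); the variables and data are invented, not the study's SPSS/MATLAB scripts.

```python
import numpy as np

def mahalanobis(observations, reference_sample):
    """Mahalanobis distance of each observation from the reference sample's mean,
    using the reference sample's covariance matrix."""
    ref = np.asarray(reference_sample, float)
    mu = ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))
    diff = np.atleast_2d(observations) - mu
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

# Hypothetical variables: temperature, AQI, PM2.5 (sunny-weather reference sample).
rng = np.random.default_rng(2)
sunny = rng.normal([25.0, 60.0, 35.0], [3.0, 10.0, 8.0], size=(200, 3))
observed = [[26.0, 65.0, 40.0], [35.0, 120.0, 90.0]]
print(mahalanobis(observed, sunny))
```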

Keywords: Fitting, principal component analysis, Mahalanobis distance, SPSS, MATLAB.

8082 Egg Production Performance of Old Laying Hen Fed Dietary Turmeric Powder

Authors: D. P. Rahardja, M. Rahman Hakim, V. Sri Lestari

Abstract:

An experiment was conducted to elucidate the effects of turmeric powder supplementation on egg production performance of old laying hens (80 weeks of age). There were 40 hens of Hysex Brown strain used in the study. They were caged individually, and randomly divided into 4 treatment groups of diet containing 0 (control), 1, 2 and 4 % oven dried turmeric powder for 3 periods of 4 weeks; Egg production (% hen day) and feed intake of the 4 treatment groups at the commencement of the experiment were not significantly different. In addition to egg production performance (% and egg weight), feed and water intakes were measured daily, and cholesterol content of the whole egg was determined. The results indicated that feed intakes of the hen were significantly lowered when 4% turmeric powder supplemented, while there were no significant changes in water intakes. Egg production were significantly increased and maintained at a higher level by turmeric powder supplementation up to 4% compared with the control, while the weight of eggs were not significantly affected. The research markedly demonstrated that supplementation of turmeric powder up to 4% could improve and maintain egg production performance of the old laying hen at a higher level with a lower cholesterol content. 

Keywords: Curcumin, feed and water intake, old laying hen, egg production.

8081 Oil Recovery Study by Low Temperature Carbon Dioxide Injection in High-Pressure High-Temperature Micromodels

Authors: Zakaria Hamdi, Mariyamni Awang

Abstract:

For the past decades, CO2 flooding has been used as a successful method for enhanced oil recovery (EOR). However, a high mobility ratio and the fingering effect are considered important drawbacks of this process. Low temperature injection of CO2 into high temperature reservoirs may improve oil recovery, but simulating multiphase flow in a non-isothermal medium is difficult, and commercial simulators are very unstable under these conditions. Furthermore, to the best of the authors’ knowledge, no experimental work has been done to verify the results of the simulations and to understand the pore-scale process. In this paper, we present results of investigations on injection of low temperature CO2 into a high-pressure high-temperature micromodel with injection temperatures ranging from 34 to 75 °F. The effects of temperature and the saturation changes of the different fluids are measured in each case. The results support the proposed method: the injection of CO2 at low temperatures increased the oil recovery in high temperature reservoirs significantly. Also, the CO2-rich phases present in the high temperature system can affect oil recovery through a better sweep of the oil, which is initially caused by penetration of LCO2 into the system. Furthermore, no unfavorable effect was detected using this method. Low temperature CO2 is proposed to be used as early as secondary recovery.

Keywords: Enhanced oil recovery, CO2 flooding, micromodel studies, miscible flooding.

8080 Positive Solutions for a Class of Semipositone Discrete Boundary Value Problems with Two Parameters

Authors: Benshi Zhu

Abstract:

In this paper, the existence, multiplicity and nonexistence of positive solutions for a class of semipositone discrete boundary value problems with two parameters are studied by applying nonsmooth critical point theory and the sub-super solutions method.

Keywords: Discrete boundary value problems, nonsmooth critical point theory, positive solutions, semipositone, sub-super solutions method

8079 Spatial Pattern and GIS-Based Model for Risk Assessment – A Case Study of Dusit District, Bangkok

Authors: Morakot Worachairungreung

Abstract:

The objectives of the research are to study the pattern of fire location distribution and to develop techniques for applying Geographic Information Systems to fire risk assessment for fire planning and management. Fire risk assessment was based on two groups of factors: vulnerability factors, such as building material type, building height and building density, and mitigation-capacity factors, such as accessibility by road, distance to the nearest fire station and distance to hydrants. The factors were obtained from four groups of stakeholders: firemen, city planners, local government officers and local residents. Factors obtained from all stakeholders were converted into GIS raster data and then superimposed in order to prepare a fire risk map of the area showing the level of fire risk ranging from high to low. The level of fire risk was obtained from the weighted mean of each factor based on the stakeholders, and the weight of each factor was obtained by Analytical Hierarchy Analysis.
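Assuming the Analytical Hierarchy Analysis mentioned here is the usual pairwise-comparison (AHP-style) weighting, the sketch below derives factor weights from a pairwise comparison matrix and applies a weighted overlay to raster factor layers; the judgments and rasters are invented for illustration.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix via its principal eigenvector."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def risk_map(factor_rasters, weights):
    """Weighted overlay: cell-wise weighted mean of reclassified factor rasters."""
    stack = np.stack(factor_rasters)               # (n_factors, rows, cols)
    return np.tensordot(weights, stack, axes=1)

# Hypothetical 3x3 pairwise judgments for: building density, building material, road access.
pairwise = [[1, 3, 5],
            [1/3, 1, 2],
            [1/5, 1/2, 1]]
w = ahp_weights(pairwise)

rng = np.random.default_rng(3)
rasters = [rng.integers(1, 6, size=(4, 4)).astype(float) for _ in range(3)]  # risk scores 1-5
print(w, "\n", risk_map(rasters, w))
```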

Keywords: Fire Risk Assessment, Geographic Information System: GIS, Raster Analysis and Analytical Hierarchy Analysis.

8078 Facile Synthesis of Vertically Aligned ZnO Nanowires on Carbon Layer by Vapour Deposition

Authors: Kh. A. Abdullin, N. B. Bakranov, S. E. Kudaibergenov, S.E. Kumekov, V. N. Ermolaev, L. V. Podrezova

Abstract:

A facile vapour deposition method for the synthesis of vertically aligned ZnO nanowires on a carbon seed layer was developed. The obtained samples were investigated with a JEOL JSM-6490 LA electron microscope and an X'Pert MPD PRO X-ray diffractometer. The photoluminescence (PL) spectra of the obtained ZnO samples at room temperature were studied using a He-Cd laser (325 nm line) as the excitation source.

Keywords: ZnO nanowires, vapor-phase deposition, Ni catalytic layer, facile method of synthesis, carbon catalytic layer, photoluminescence spectra, X-ray spectrum.

8077 A New Configurable Decimation Filter Using Pascal's Triangle Theorem

Authors: A. Chahardah Cherik, E. Farshidi

Abstract:

This paper presents a new configurable decimation filter for sigma-delta modulators. The filter employs Pascal's triangle theorem for building the coefficients of non-recursive decimation filters. The filter can be connected to the back-end of various modulators with different output accuracy. In this work, two methods are shown and then compared from the area occupation viewpoint. The first method uses memory, while the second one employs the Pascal's triangle method, aiming to reduce the required gates. XILINX ISE v10 is used for implementation and verification of the filter.
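The Pascal's triangle idea can be sketched with binomial coefficients: row N of the triangle is the impulse response of (1 + z⁻¹)ᴺ, a simple non-recursive averaging filter that can be applied before downsampling. The order, decimation ratio and NumPy implementation below are assumptions, not the paper's FPGA design.

```python
import numpy as np
from math import comb

def pascal_coefficients(order):
    """Row 'order' of Pascal's triangle = impulse response of (1 + z^-1)^order."""
    return np.array([comb(order, k) for k in range(order + 1)], dtype=float)

def decimate(x, order=4, ratio=8):
    """Filter with the binomial FIR (normalized to unit DC gain), then downsample."""
    h = pascal_coefficients(order)
    y = np.convolve(x, h / h.sum(), mode="full")
    return y[::ratio]

# Example: decimating a noisy 1-bit-like sigma-delta style sequence (placeholder input).
rng = np.random.default_rng(4)
bitstream = np.sign(np.sin(2 * np.pi * 0.01 * np.arange(1024)) + 0.5 * rng.standard_normal(1024))
print(decimate(bitstream)[:8])
```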

Keywords: Decimation filter, sigma delta, Pascal's triangle theorem, memory

8076 Ensembling Adaptively Constructed Polynomial Regression Models

Authors: Gints Jekabsons

Abstract:

The approach of subset selection in polynomial regression model building assumes that the chosen fixed full set of predefined basis functions contains a subset that is sufficient to describe the target relation sufficiently well. However, in most cases the necessary set of basis functions is not known and needs to be guessed – a potentially non-trivial (and long) trial and error process. In our research we consider a potentially more efficient approach – Adaptive Basis Function Construction (ABFC). It lets the model building method itself construct the basis functions necessary for creating a model of arbitrary complexity with adequate predictive performance. However, there are two issues that to some extent plague the methods of both the subset selection and the ABFC, especially when working with relatively small data samples: the selection bias and the selection instability. We try to correct these issues by model post-evaluation using Cross-Validation and model ensembling. To evaluate the proposed method, we empirically compare it to ABFC methods without ensembling, to a widely used method of subset selection, as well as to some other well-known regression modeling methods, using publicly available data sets.
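As a generic illustration of combining cross-validation with model ensembling (not the ABFC algorithm itself), the sketch below trains one polynomial regression model per cross-validation fold and averages their predictions; scikit-learn is assumed to be available and the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def cv_ensemble_predict(X, y, X_new, degree=3, n_splits=5):
    """Fit one polynomial model per training fold and average their predictions."""
    preds = []
    for train_idx, _ in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        model.fit(X[train_idx], y[train_idx])
        preds.append(model.predict(X_new))
    return np.mean(preds, axis=0)

# Toy data set with a known cubic target relation plus noise.
rng = np.random.default_rng(5)
X = rng.uniform(-2, 2, size=(200, 1))
y = X[:, 0] ** 3 - X[:, 0] + 0.1 * rng.standard_normal(200)
print(cv_ensemble_predict(X, y, np.array([[0.5], [1.5]])))
```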

Keywords: Basis function construction, heuristic search, model ensembles, polynomial regression.

8075 A Method to Compute Efficient 3D Helicopters Flight Trajectories Based on a Motion Polymorph-Primitives Algorithm

Authors: Konstanca Nikolajevic, Nicolas Belanger, David Duvivier, Rabie Ben Atitallah, Abdelhakim Artiba

Abstract:

Finding the optimal 3D path of an aerial vehicle under flight mechanics constraints is a major challenge, especially when the algorithm has to produce real-time results in flight. Kinematics models and Pythagorean hodograph curves have been widely used in mobile robotics to solve this problem. The level of difficulty is mainly driven by the number of constraints to be saturated at the same time while minimizing the total length of the path. In this paper, we suggest a pragmatic algorithm capable of saturating at the same time most of the constraints dimensioning helicopter 3D trajectories, such as curvature, curvature derivative, torsion, torsion derivative, climb angle, climb angle derivative, and positions. The trajectory generation algorithm is able to generate versatile, complex 3D motion primitives feasible by a helicopter, with parameterization of the curvature and the climb angle. An upper-level “motion primitives’ concatenation” algorithm is also presented. In this article we introduce a new way of designing three-dimensional trajectories based on what we call the “Dubins gliding symmetry conjecture”. This highly efficient algorithm will soon be integrated into a real-time decisional system dealing with in-flight safety issues.
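The constraints listed above can be checked numerically on any sampled 3D path; the sketch below does so with finite differences for curvature and climb angle only, as an illustration of the saturation test rather than the authors' primitive-generation algorithm.

```python
import numpy as np

def path_constraints(points, dt=1.0):
    """Curvature and climb angle along a sampled 3D path (finite differences)."""
    p = np.asarray(points, float)
    v = np.gradient(p, dt, axis=0)                 # velocity
    a = np.gradient(v, dt, axis=0)                 # acceleration
    speed = np.linalg.norm(v, axis=1)
    curvature = np.linalg.norm(np.cross(v, a), axis=1) / speed**3
    climb = np.arcsin(v[:, 2] / speed)             # angle of the velocity above horizontal
    return curvature, climb

def feasible(points, max_curvature=0.02, max_climb=np.radians(15)):
    k, gamma = path_constraints(points)
    return bool(np.all(k <= max_curvature) and np.all(np.abs(gamma) <= max_climb))

# Example: a gently climbing helix (placeholder trajectory, units arbitrary).
t = np.linspace(0, 60, 300)
helix = np.c_[80 * np.cos(0.05 * t), 80 * np.sin(0.05 * t), 0.8 * t]
print(feasible(helix))
```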

Keywords: Aerial robots, Motion primitives, Robotics.

8074 Nonlinear Dynamic Analysis of Base-Isolated Structures Using a Mixed Integration Method: Stability Aspects and Computational Efficiency

Authors: Nicolò Vaiana, Filip C. Filippou, Giorgio Serino

Abstract:

In order to reduce numerical computations in the nonlinear dynamic analysis of seismically base-isolated structures, a Mixed Explicit-Implicit time integration Method (MEIM) has been proposed. Adopting the explicit, conditionally stable central difference method to compute the nonlinear response of the base isolation system, and the implicit, unconditionally stable Newmark’s constant average acceleration method to determine the superstructure linear response, the proposed MEIM, which is conditionally stable due to the use of the central difference method, avoids the iterative procedure generally required by conventional monolithic solution approaches within each time step of the analysis. The main aim of this paper is to investigate the stability and computational efficiency of the MEIM when employed to perform the nonlinear time history analysis of base-isolated structures with sliding bearings. Indeed, in this case, the critical time step could become smaller than the one used to define the earthquake excitation accurately, due to the very high initial stiffness values of such devices. The numerical results obtained from nonlinear dynamic analyses of a base-isolated structure with a friction pendulum bearing system, performed using the proposed MEIM, are compared to those obtained with a conventional monolithic solution approach, i.e. the implicit unconditionally stable Newmark’s constant acceleration method employed in conjunction with the iterative pseudo-force procedure. According to the numerical results, in the presented numerical application the MEIM does not have stability problems, since the critical time step remains larger than the ground acceleration time step despite the high initial stiffness of the friction pendulum bearings. In addition, compared to the conventional monolithic solution approach, the proposed algorithm preserves its computational efficiency even when it is adopted to perform the nonlinear dynamic analysis using a smaller time step.
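To make the two ingredients concrete, the sketch below advances a nonlinear single-degree-of-freedom system with the explicit central difference scheme and a linear one with Newmark's constant average acceleration scheme; this is a textbook illustration of the two integrators, not the coupled MEIM partitioning itself, and the parameters are placeholders.

```python
import numpy as np

def central_difference_step(u_prev, u_curr, m, c, restoring, p, dt):
    """Explicit central difference update for m*u'' + c*u' + r(u) = p."""
    a_hat = m / dt**2 + c / (2 * dt)
    rhs = (p - restoring(u_curr) + (2 * m / dt**2) * u_curr
           - (m / dt**2 - c / (2 * dt)) * u_prev)
    return rhs / a_hat                       # displacement at the next time step

def newmark_caa_step(u, v, a, m, c, k, p_next, dt):
    """Newmark constant-average-acceleration (gamma=1/2, beta=1/4) step, linear system."""
    beta, gamma = 0.25, 0.5
    k_eff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    p_eff = (p_next
             + m * (u / (beta * dt**2) + v / (beta * dt) + a * (1 / (2 * beta) - 1))
             + c * (gamma * u / (beta * dt) + v * (gamma / beta - 1)
                    + a * dt * (gamma / (2 * beta) - 1)))
    u_next = p_eff / k_eff
    a_next = (u_next - u) / (beta * dt**2) - v / (beta * dt) - a * (1 / (2 * beta) - 1)
    v_next = v + dt * ((1 - gamma) * a + gamma * a_next)
    return u_next, v_next, a_next

# One step of each scheme on toy SDOF systems (placeholder parameters).
fp_restoring = lambda u: 50.0 * np.tanh(200.0 * u)        # crude friction-pendulum-like force
print(central_difference_step(0.0, 0.001, m=10.0, c=0.5, restoring=fp_restoring, p=5.0, dt=0.001))
print(newmark_caa_step(0.0, 0.0, 0.0, m=100.0, c=2.0, k=4000.0, p_next=50.0, dt=0.01))
```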

Keywords: Base isolation, computational efficiency, mixed explicit-implicit method, partitioned solution approach, stability.

8073 Wasteless Solid-Phase Method for Conversion of Iron Ores Contaminated with Silicon and Phosphorus Compounds

Authors: А. V. Panko, Е. V. Ablets, I. G. Kovzun, М. А. Ilyashov

Abstract:

Based upon a generalized analysis of modern know-how in the sphere of processing, concentration and purification of iron-ore raw materials (IORM), in particular the most widespread ferrioxide-silicate materials (FOSM) containing impurities of phosphorus and other elements' compounds, the special role of nanotechnological initiatives in the improvement of such processes is noted. Ideas on the role of nanoparticles in processes of FOSM carbonization with subsequent direct reduction of the ferric oxides contained in them to the metal phase, as well as in processes of alkali treatment and separation of powdered iron from phosphorus compounds, are considered. Using the obtained results, a wasteless method of solid-phase processing, concentration and purification of IORM and FOSM from compounds of phosphorus, silicon and other impurities was developed, and it excels known methods of direct iron reduction from iron ores and metallurgical slimes.

Keywords: Iron ores, solid-phase reduction, nanoparticles in reduction and purification of iron from silicon and phosphorus, wasteless method of ores processing.

8072 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping

Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa

Abstract:

The artificial neural network is one of the interesting techniques that have been advantageously used to deal with modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to modulate the information processing of a one-dimensional task. We aim to integrate a new method based on a new coding approach for generating the input-output mapping. The latter is based on increasing the number of neuron units in the last layer. Accordingly, to show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the number of neuron units in the last layer makes it possible to find the optimal network parameters that fit the mapping data. Moreover, it reduces the training time during the computation process, which avoids the use of computers with high memory usage.

Keywords: Neural network computing, information processing, input-output mapping, training time, computers with high memory.

8071 Improving the Design of Blood Pressure and Blood Saturation Monitors

Authors: L. Parisi

Abstract:

A blood pressure monitor or sphygmomanometer can be either manual or automatic, employing respectively either the auscultatory method or the oscillometric method. The manual version of the sphygmomanometer involves an inflatable cuff with a stethoscope adopted to detect the sounds generated by the arterial walls to measure blood pressure in an artery. An automatic sphygmomanometer can be effectively used to monitor blood pressure through a pressure sensor, which detects vibrations provoked by oscillations of the arterial walls. The pressure sensor implemented in this device improves the accuracy of the measurements taken.

Keywords: Blood pressure, blood saturation, sensors, actuators, design improvement.

8070 Advanced Jet Trainer and Light Attack Aircraft Selection Using Composite Programming in Multiple Criteria Decision Making Analysis Method

Authors: C. Ardil

Abstract:

In this paper, composite programming is discussed for the aircraft evaluation and selection problem using the multiple criteria decision making analysis method. The decision criteria and aircraft alternatives were identified from a literature review. The criteria weights were determined by the standard deviation method. The proposed model is applied to a practical decision problem for evaluating and selecting an advanced jet trainer and light attack aircraft. The proposed technique gives robust and efficient results in modeling multiple criteria decisions. As a result of the composite programming analysis, Hürjet, an advanced jet trainer and light attack aircraft alternative (a3), was chosen as the most suitable aircraft candidate.
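A rough sketch of the two computational steps named above is given below, under the assumption that standard deviation weighting and a single-level weighted Lp distance to the ideal point are used; the decision matrix is invented and the full hierarchical grouping of criteria in composite programming is omitted.

```python
import numpy as np

def std_weights(matrix):
    """Criterion weights from the standard deviation method on a min-max normalized matrix."""
    m = np.asarray(matrix, float)
    norm = (m - m.min(axis=0)) / (m.max(axis=0) - m.min(axis=0))
    s = norm.std(axis=0, ddof=1)
    return s / s.sum()

def composite_distance(matrix, weights, p=2, benefit=None):
    """Weighted Lp distance of each alternative to the ideal point (smaller is better).
    This is a single-level simplification of composite programming."""
    m = np.asarray(matrix, float)
    if benefit is None:
        benefit = [True] * m.shape[1]
    best = np.where(benefit, m.max(axis=0), m.min(axis=0))
    worst = np.where(benefit, m.min(axis=0), m.max(axis=0))
    d = np.abs(best - m) / np.abs(best - worst)
    return (np.sum((weights * d) ** p, axis=1)) ** (1.0 / p)

# Hypothetical decision matrix: rows = aircraft alternatives a1..a3, columns = criteria.
X = [[420, 7.5, 3.2], [450, 8.1, 2.9], [465, 7.9, 2.5]]
w = std_weights(X)
print(w, composite_distance(X, w, benefit=[True, True, False]))
```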

Keywords: composite programming, additive weighted model, multiplicative weighted model, multiple criteria decision making analysis, MCDMA, aircraft selection, advanced jet trainer and light attack aircraft, M-346, FA-50, Hürjet

8069 Aircraft Selection Using Multiple Criteria Decision Making Analysis Method with Different Data Normalization Techniques

Authors: C. Ardil

Abstract:

This paper presents an original application of multiple criteria decision making analysis theory to the evaluation of aircraft selection problem. The selection of an optimal, efficient and reliable fleet, network and operations planning policy is one of the most important factors in aircraft selection problem. Given that decision making in aircraft selection involves the consideration of a number of opposite criteria and possible solutions, such a selection can be considered as a multiple criteria decision making analysis problem. This study presents a new integrated approach to decision making by considering the multiple criteria utility theory and the maximal regret minimization theory methods as well as aircraft technical, economical, and environmental aspects. Multiple criteria decision making analysis method uses different normalization techniques to allow criteria to be aggregated with qualitative and quantitative data of the decision problem. Therefore, selecting a suitable normalization technique for the model is also a challenge to provide data aggregation for the aircraft selection problem. To compare the impact of different normalization techniques on the decision problem, the vector, linear (sum), linear (max), and linear (max-min) data normalization techniques were identified to evaluate aircraft selection problem. As a logical implication of the proposed approach, it enhances the decision making process through enabling the decision maker to: (i) use higher level knowledge regarding the selection of criteria weights and the proposed technique, (ii) estimate the ranking of an alternative, under different data normalization techniques and integrated criteria weights after a posteriori analysis of the final rankings of alternatives. A set of commercial passenger aircraft were considered in order to illustrate the proposed approach. The obtained results of the proposed approach were compared using Spearman's rho tests. An analysis of the final rank stability with respect to the changes in criteria weights was also performed so as to assess the sensitivity of the alternative rankings obtained by the application of different data normalization techniques and the proposed approach.
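The four normalization techniques named above have standard definitions for benefit-type criteria; the sketch below implements them for such a matrix, leaving the cost-criterion variants and the subsequent aggregation aside, with an invented decision matrix.

```python
import numpy as np

def normalize(matrix, method="vector"):
    """Column-wise normalization of a decision matrix (benefit criteria assumed)."""
    x = np.asarray(matrix, dtype=float)
    if method == "vector":            # x_ij / sqrt(sum_i x_ij^2)
        return x / np.sqrt((x ** 2).sum(axis=0))
    if method == "linear_sum":        # x_ij / sum_i x_ij
        return x / x.sum(axis=0)
    if method == "linear_max":        # x_ij / max_i x_ij
        return x / x.max(axis=0)
    if method == "linear_max_min":    # (x_ij - min_i) / (max_i - min_i)
        return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))
    raise ValueError(method)

# Hypothetical passenger-aircraft decision matrix (rows: alternatives, columns: criteria).
X = [[310, 0.82, 14200], [296, 0.80, 13900], [335, 0.85, 15000]]
for m in ("vector", "linear_sum", "linear_max", "linear_max_min"):
    print(m, "\n", normalize(X, m))
```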

Keywords: Normalization Techniques, Aircraft Selection, Multiple Criteria Decision Making, Multiple Criteria Decision Making Analysis, MCDMA

8068 GIS-based Non-point Sources of Pollution Simulation in Cameron Highlands, Malaysia

Authors: M. Eisakhani, A. Pauzi, O. Karim, A. Malakahmad, S.R. Mohamed Kutty, M. H. Isa

Abstract:

Cameron Highlands is a mountainous area subjected to torrential tropical showers. It extracts 5.8 million liters of water per day for drinking supply from its rivers at several intake points. The water quality of rivers in Cameron Highlands, however, has deteriorated significantly due to land clearing for agriculture, excessive usage of pesticides and fertilizers, as well as construction activities in rapidly developing urban areas. These pollution sources, known as non-point pollution sources, are diverse and hard to identify, and therefore they are difficult to estimate. Hence, Geographical Information Systems (GIS) were used to provide an extensive approach to evaluate land use and other mapping characteristics to explain the spatial distribution of non-point sources of contamination in Cameron Highlands. The method to assess pollution sources has been developed using the Cameron Highlands Master Plan (2006-2010), integrating GIS, databases, as well as pollution loads in the area of study. The results show that the highest annual runoff is created by forest, 3.56 × 10⁸ m³/yr, followed by urban development, 1.46 × 10⁸ m³/yr. Furthermore, urban development causes the highest BOD load (1.31 × 10⁶ kg BOD/yr), while agricultural activities and forest contribute the highest annual loads of phosphorus (6.91 × 10⁴ kg P/yr) and nitrogen (2.50 × 10⁵ kg N/yr), respectively. Therefore, best management practices (BMPs) are suggested to be applied to reduce the pollution level in the area.

Keywords: Cameron Highlands, Land use, Non-point Sources of Pollution

8067 Image Analysis for Obturator Foramen Based on Marker-Controlled Watershed Segmentation and Zernike Moments

Authors: Seda Sahin, Emin Akata

Abstract:

Obturator Foramen is a specific structure in Pelvic bone images and recognition of it is a new concept in medical image processing. Moreover, segmentation of bone structures such as Obturator Foramen plays an essential role for clinical research in orthopedics. In this paper, we present a novel method to analyze the similarity between the substructures of the imaged region and a hand drawn template as a preprocessing step for computation of Pelvic bone rotation on hip radiographs. This method consists of integrated usage of Marker-controlled Watershed segmentation and Zernike moment feature descriptor and it is used to detect Obturator Foramen accurately. Marker-controlled Watershed segmentation is applied to separate Obturator Foramen from the background effectively. Then, Zernike moment feature descriptor is used to provide matching between binary template image and the segmented binary image for final extraction of Obturator Foramens. Finally, Pelvic bone rotation rate calculation for each hip radiograph is performed automatically to select and eliminate hip radiographs for further studies which depend on Pelvic bone angle measurements. The proposed method is tested on randomly selected 100 hip radiographs. The experimental results demonstrated that the proposed method is able to segment Obturator Foramen with 96% accuracy.
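A rough open-source analogue of the pipeline (marker-controlled watershed followed by Zernike moment descriptors) can be sketched with scikit-image and mahotas; the preprocessing, marker choice and similarity measure below are placeholders rather than the parameters used in the study.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed
import mahotas

def segment_regions(binary_image):
    """Marker-controlled watershed on a binary mask: markers come from local maxima
    of the distance transform, which keeps touching regions separated."""
    distance = ndi.distance_transform_edt(binary_image)
    peaks = peak_local_max(distance, labels=binary_image, footprint=np.ones((15, 15)))
    markers = np.zeros_like(binary_image, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=binary_image)

def zernike_descriptor(binary_region, radius=50, degree=8):
    """Rotation-invariant Zernike moment feature vector of a binary region."""
    return mahotas.features.zernike_moments(binary_region.astype(np.uint8), radius, degree)

def match_score(descriptor_a, descriptor_b):
    """Simple Euclidean dissimilarity between a segmented region and a template."""
    return np.linalg.norm(np.asarray(descriptor_a) - np.asarray(descriptor_b))
```

With a thresholded radiograph as `binary_image`, each labelled region's descriptor could then be compared with the descriptor of the hand-drawn template via `match_score`.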

Keywords: Medical image analysis, marker-controlled watershed segmentation, segmentation of bone structures on hip radiographs, pelvic bone rotation rate, zernike moment feature descriptor.

8066 Robust Conversion of Chaos into an Arbitrary Periodic Motion

Authors: Abolhassan Razminia, Mohammad-Ali Sadrnia

Abstract:

One of the most attractive and important fields of chaos theory is the control of chaos. In this paper, we present a simple framework for chaotic motion control using the feedback linearization method. Using this approach, we derive a strategy which can be easily applied to other chaotic systems. This work presents two novel results: the desired periodic orbit need not be a solution of the original dynamics, and the response is robust against parameter variations. The simulations illustrate both of these properties. In addition, a comparison between a conventional state feedback and our proposed method demonstrates that the introduced technique is more efficient.
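To make the idea concrete, the sketch below applies feedback linearization to a forced Duffing oscillator, a standard chaotic benchmark (not necessarily the system used in the paper), so that it tracks a sinusoidal reference that is not a solution of the uncontrolled dynamics; the gains and parameters are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Forced Duffing oscillator: x'' = -delta*x' + x - x**3 + gamma*cos(omega*t) + u
delta, gamma, omega = 0.3, 0.5, 1.2          # parameter set commonly cited as chaotic for u = 0
kp, kd = 25.0, 10.0                          # tracking error gains

def reference(t):                            # desired periodic orbit (arbitrary choice)
    xd = 0.8 * np.sin(0.7 * t)
    return xd, 0.8 * 0.7 * np.cos(0.7 * t), -0.8 * 0.7**2 * np.sin(0.7 * t)

def controlled_duffing(t, state):
    x, v = state
    xd, vd, ad = reference(t)
    drift = -delta * v + x - x**3 + gamma * np.cos(omega * t)
    # Feedback linearization: cancel the drift, impose stable linear error dynamics.
    u = ad + kd * (vd - v) + kp * (xd - x) - drift
    return [v, drift + u]

sol = solve_ivp(controlled_duffing, (0.0, 40.0), [1.0, 0.0], max_step=0.01)
print("final tracking error:", sol.y[0, -1] - reference(sol.t[-1])[0])
```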

Keywords: chaos, feedback linearization, robust control, periodic motion.

8065 Comparing Abused and Normal Male Students in Tehran Guidance Schools: Emphasizing the Co-Dependency of Their Mothers

Authors: Mohamad Saleh Sangin Ostadi, Esmail Safari, Somayeh Akbari, Kaveh Qaderi Bagajan

Abstract:

The aim of this study is to compare abused and normal male students in Tehran guidance schools, with emphasis on the co-dependency of their mothers. The study is based on a survey and comparison (ex post facto) design, with multi-stage cluster sampling. Accordingly, we sampled from guidance schools under the education and training administration in Tehran, including 12 schools covering the first, second and third grades. The schools represent three socio-economic conditions: high, medium and low. Three classes from every school and 20 students from each class were then randomly selected. Using the CTQ, abused and normal students were separated: 670 children were recognized as normal and 50 children as abused. Then, 50 children were randomly selected from the normal group and compared with the abused group. Using the Spann-Fischer Co-dependency Scale, we compared the mothers of abused and normal students. The results showed that mothers of the abused children have a higher co-dependency average compared to the mothers of the normal children.

Keywords: Co-dependency, child abuse, abused children, parental psychological health.

8064 Adaptive Kernel Principal Analysis for Online Feature Extraction

Authors: Mingtao Ding, Zheng Tian, Haixia Xu

Abstract:

The batch nature limits the standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for online extraction of the kernel principal components (KPC). The contribution of this paper may be divided into two parts. First, kernel covariance matrix is correctly updated to adapt to the changing characteristics of data. Second, KPC are recursively formulated to overcome the batch nature of standard KPCA.This formulation is derived from the recursive eigen-decomposition of kernel covariance matrix and indicates the KPC variation caused by the new data. The proposed method not only alleviates sub-optimality of the KPCA method for non-stationary data, but also maintains constant update speed and memory usage as the data-size increases. Experiments for simulation data and real applications demonstrate that our approach yields improvements in terms of both computational speed and approximation accuracy.

Keywords: adaptive method, kernel principal component analysis, online extraction, recursive algorithm

8063 A Mixed Method Investigation of the Impact of Practicum Experience on Mathematics Female Pre-Service Teachers’ Sense of Preparedness

Authors: Fatimah Alsaleh, Glenda Anthony

Abstract:

The practicum experience is a critical component of any initial teacher education (ITE) course. As well as providing a near authentic setting for pre-service teachers (PSTs) to practice in, it also plays a key role in shaping their perceptions and sense of preparedness. Nevertheless, merely including a practicum period as a compulsory part of ITE may not in itself be enough to induce feelings of preparedness and efficacy; the quality of the classroom experience must also be considered. Drawing on findings of a larger study of secondary and intermediate level mathematics PSTs’ sense of preparedness to teach, this paper examines the influence of the practicum experience in particular. The study sample comprised female mathematics PSTs who had almost completed their teaching methods course in their fourth year of ITE across 16 teacher education programs in Saudi Arabia. The impact of the practicum experience on PSTs’ sense of preparedness was investigated via a mixed-methods approach combining a survey (N = 105) and in-depth interviews with survey volunteers (N = 16). Statistical analysis in SPSS was used to explore the quantitative data, and thematic analysis was applied to the qualitative interviews data. The results revealed that the PSTs perceived the practicum experience to have played a dominant role in shaping their feelings of preparedness and efficacy. However, despite the generally positive influence of practicum, the PSTs also reported numerous challenges that lessened their feelings of preparedness. These challenges were often related to the classroom environment and the school culture. For example, about half of the PSTs indicated that the practicum schools did not have the resources available or the support necessary to help them learn the work of teaching. In particular, the PSTs expressed concerns about translating the theoretical knowledge learned at the university into practice in authentic classrooms. These challenges engendered PSTs feeling less prepared and suggest that more support from both the university and the school is needed to help PSTs develop a stronger sense of preparedness. The area in which PSTs felt least prepared was that of classroom and behavior management, although the results also indicated that PSTs only felt a moderate level of general teaching efficacy and were less confident about how to support students as learners. Again, feelings of lower efficacy were related to the dissonance between the theory presented at university and real-world classroom practice. In order to close this gap between theory and practice, PSTs expressed the wish to have more time in the practicum, and more accountability for support from school-based mentors. In highlighting the challenges of the practicum in shaping PSTs’ sense of preparedness and efficacy, the study argues that better communication between the ITE providers and the practicum schools is necessary in order to maximize the benefit of the practicum experience.

Keywords: Mathematics, practicum experience, pre-service teachers, sense of preparedness.

8062 Mixture Design Experiment on Flow Behaviour of O/W Emulsions as Affected by Polysaccharide Interactions

Authors: Nor Hayati Ibrahim, Yaakob B. Che Man, Chin Ping Tan, Nor Aini Idris

Abstract:

Interaction effects of xanthan gum (XG), carboxymethyl cellulose (CMC), and locust bean gum (LBG) on the flow properties of oil-in-water emulsions were investigated by a mixture design experiment. Blends of XG, CMC and LBG were prepared according to an augmented simplex-centroid mixture design (10 points) and used at 0.5% (wt/wt) in the emulsion formulations. An appropriate mathematical model was fitted to express each response as a function of the proportions of the blend components, able to empirically predict the response to any combination of the components. The synergistic interaction effect of the ternary XG:CMC:LBG blends at approximately 33-67% XG levels was shown to be much stronger than that of the binary XG:LBG blend at the 50% XG level (p < 0.05). Nevertheless, an antagonistic interaction effect became significant as the CMC level in blends exceeded 33% (p < 0.05). Yield stress and apparent viscosity (at 10 s⁻¹) responses were successfully fitted with a special quartic model, while flow behaviour index and consistency coefficient were fitted with a full quartic model (adjusted R² ≥ 0.90). This study found that a mixture design approach could serve as a valuable tool in better elucidating and predicting the interaction effects beyond the conventional two-component blends.
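For reference, the special quartic Scheffé model for three components (x1 + x2 + x3 = 1) augments the quadratic blending terms with x1²x2x3, x1x2²x3 and x1x2x3²; the sketch below builds that design matrix and fits it by least squares on invented responses, not on the study's measurements.

```python
import numpy as np

def special_quartic_design(x):
    """Scheffe special quartic terms for a 3-component mixture (rows of x sum to 1):
    b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3
    + b1123*x1^2*x2*x3 + b1223*x1*x2^2*x3 + b1233*x1*x2*x3^2."""
    x1, x2, x3 = x[:, 0], x[:, 1], x[:, 2]
    return np.column_stack([x1, x2, x3, x1*x2, x1*x3, x2*x3,
                            x1**2*x2*x3, x1*x2**2*x3, x1*x2*x3**2])

# Augmented simplex-centroid proportions (XG, CMC, LBG) and invented responses.
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [.5, .5, 0], [.5, 0, .5], [0, .5, .5], [1/3, 1/3, 1/3],
              [2/3, 1/6, 1/6], [1/6, 2/3, 1/6], [1/6, 1/6, 2/3]])
y = np.array([3.1, 1.2, 2.0, 4.0, 5.6, 2.4, 4.8, 4.5, 2.6, 3.9])   # e.g. yield stress, made up
coef, *_ = np.linalg.lstsq(special_quartic_design(X), y, rcond=None)
print(np.round(coef, 3))
```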

Keywords: O/W emulsions, flow behavior, polysaccharide interaction, mixture design.
