Search results for: Generalized matrix approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6128

4838 CO2 Emission and Cost Optimization of Reinforced Concrete Frame Designed by Performance Based Design Approach

Authors: Jin Woo Hwang, Byung Kwan Oh, Yousok Kim, Hyo Seon Park

Abstract:

As the greenhouse effect has come to be recognized as a serious global environmental problem, interest in carbon dioxide (CO2) emissions, which make up the major part of greenhouse gas (GHG) emissions, has increased in recent years. Since the construction industry accounts for a relatively large portion of the world's total CO2 emissions, extensive studies on reducing CO2 emissions in the construction and operation of buildings have been carried out since the 2000s. In parallel, the performance based design (PBD) methodology based on nonlinear analysis has been developed intensively since the 1994 Northridge Earthquake to assess and assure the seismic performance of buildings more exactly, because structural engineers recognized that a prescriptive code based design approach cannot address inelastic earthquake responses directly. Although CO2 emissions and the PBD approach are both rising issues in the construction industry and structural engineering, few studies have considered the two issues simultaneously. Thus, the objective of this study is to minimize the CO2 emissions and cost of a building designed by the PBD approach at the structural design stage, considering the structural materials. A four-story, four-span reinforced concrete building was optimally designed to minimize CO2 emissions and cost while satisfying a specific seismic performance objective (collapse prevention under the maximum considered earthquake) and prescriptive code regulations, using the non-dominated sorting genetic algorithm-II (NSGA-II). The optimized design showed that minimized CO2 emissions and cost were achieved while the specified seismic performance was satisfied. Therefore, the methodology proposed in this paper can be used to reduce both the CO2 emissions and the cost of buildings designed by the PBD approach.
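
A minimal sketch of the non-dominated sorting step at the heart of NSGA-II, applied to hypothetical (CO2, cost) objective pairs; the candidate designs and objective values below are illustrative assumptions, not the paper's structural model or constraint handling.

```python
# Minimal non-dominated sorting sketch for a bi-objective (CO2, cost) problem.
# The candidate "designs" and their objective values are made up for illustration;
# a real PBD workflow would evaluate CO2 and cost from the structural design itself.
import random

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(objectives):
    """Return indices of the non-dominated (rank-1) solutions."""
    front = []
    for i, fi in enumerate(objectives):
        if not any(dominates(fj, fi) for j, fj in enumerate(objectives) if j != i):
            front.append(i)
    return front

# Hypothetical population: each design has (CO2 emissions [t], cost [$1000]).
random.seed(0)
population = [(random.uniform(100, 300), random.uniform(500, 900)) for _ in range(20)]
print("Pareto-optimal designs:", [population[i] for i in pareto_front(population)])
```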

Keywords: CO2 emissions, performance based design, optimization, sustainable design.

4837 Generalized Method for Estimating Best-Fit Vertical Alignments for Profile Data

Authors: Said M. Easa, Shinya Kikuchi

Abstract:

When the profile information of an existing road is missing or out of date and the parameters of the vertical alignment are needed for engineering analysis, the engineer has to recreate the geometric design features of the road alignment from collected profile data. The profile data may be collected using traditional surveying methods, global positioning systems, or digital imagery. This paper develops a method that estimates the parameters of the geometric features that best characterize the existing vertical alignment in terms of tangents and curve expressions, where the curves may be symmetrical, asymmetrical, reverse, or complex vertical curves. The method is implemented using an Excel-based optimization that minimizes the differences between the observed profile and the profiles estimated from the vertical curve equations. The method uses a 'wireframe' representation of the profile that makes it applicable to all types of vertical curves. A secondary contribution of this paper is to introduce the properties of the equal-arc asymmetrical curve, which has recently been developed in the highway geometric design field.
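
As an illustration of the fitting idea, the sketch below estimates the parameters of a single symmetrical (parabolic) vertical curve from noisy profile points by least squares; the curve model and the synthetic data are assumptions for demonstration, not the paper's generalized multi-curve formulation or its spreadsheet implementation.

```python
# Fit a symmetrical parabolic vertical curve y(x) = y0 + g1*x + ((g2 - g1)/(2*L))*x**2
# to observed profile points by least squares. Synthetic data stand in for survey data.
import numpy as np
from scipy.optimize import least_squares

L_curve = 300.0                      # assumed known curve length [m]

def profile(params, x):
    y0, g1, g2 = params              # start elevation and entry/exit grades
    return y0 + g1 * x + (g2 - g1) / (2.0 * L_curve) * x**2

rng = np.random.default_rng(1)
x_obs = np.linspace(0.0, L_curve, 31)
y_true = profile((100.0, 0.03, -0.02), x_obs)
y_obs = y_true + rng.normal(0.0, 0.02, x_obs.size)   # simulated measurement noise

res = least_squares(lambda p: profile(p, x_obs) - y_obs, x0=(90.0, 0.0, 0.0))
print("estimated y0, g1, g2:", res.x)
```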

Keywords: Optimization, parameters, data, reverse, spreadsheet, vertical curves

4836 Bottom Up Text Mining through Hierarchical Document Representation

Authors: Y. Djouadi., F. Souam.

Abstract:

Most existing text mining approaches are designed with the transaction database model in mind. The mined dataset is thus structured using just one concept, the "transaction", while the whole dataset is modeled as an abstract "set". In such cases, the structure of the whole dataset and the relationships among the transactions themselves are not modeled and consequently not considered in the mining process. We believe that taking the structural properties of hierarchically structured information (e.g. textual documents) into account in the mining process can lead to better results. For this purpose, a hierarchical association rule mining approach for textual documents is proposed in this paper, and the classical set-oriented mining approach is reconsidered in favour of a Directed Acyclic Graph (DAG) oriented approach. Natural language processing techniques are used to obtain the DAG structure. Based on this graph model, a hierarchical bottom-up algorithm is proposed. The main idea is that each node is mined together with its parent node.
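
A toy sketch of the bottom-up idea, assuming a small hand-built document DAG and simple term-pair support counting; the actual DAG construction via NLP and the full association rule machinery of the paper are beyond this illustration.

```python
# Toy bottom-up pass over a document DAG: each node's terms are mined jointly with
# its parent's terms by counting co-occurring term pairs. The tiny DAG and term sets
# below are illustrative stand-ins for an NLP-derived document structure.
from collections import Counter
from itertools import product

# node -> parent (None for the root) and node -> set of terms
parent = {"doc": None, "sec1": "doc", "sec2": "doc", "par1": "sec1", "par2": "sec1"}
terms = {
    "doc": {"mining"},
    "sec1": {"text", "mining"},
    "sec2": {"graph"},
    "par1": {"text", "association"},
    "par2": {"rule", "association"},
}

pair_support = Counter()
# Bottom-up: children are visited before their parents.
for node in ["par1", "par2", "sec1", "sec2", "doc"]:
    p = parent[node]
    if p is None:
        continue
    for t_child, t_parent in product(terms[node], terms[p]):
        pair_support[(t_child, t_parent)] += 1

print(pair_support.most_common(3))
```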

Keywords: Graph based association rules mining, Hierarchical document structure, Text mining.

4835 Conditions on Blind Source Separability of Linear FIR-MIMO Systems with Binary Inputs

Authors: Jiashan Tang

Abstract:

In this note, we investigate the blind source separability of linear FIR-MIMO systems. The concept of semi-reversibility of a system is presented. It is shown that for a semi-reversible system, if the input signals belong to a binary alphabet, then the source data can be blindly separated. One sufficient condition for a system to be semi-reversible is obtained. It is also shown that the proposed criterion is weaker than those in the literature, which require that the channel matrix be irreducible/invertible or reversible.

Keywords: Blind source separable, FIR-MIMO system, Binary input, Bezout equality.

4834 Using Textual Pre-Processing and Text Mining to Create Semantic Links

Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo

Abstract:

This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and thus generate new concepts and new semantic links. Even when using the more specific vocabularies of the oil domain, our approach achieved satisfactory results, suggesting that the proposal can be applied to other domains and languages with only minor adjustments.

Keywords: Semantic links, data mining, linked data, SKOS.

4833 BEM Formulations Based on Kirchhoff's Hypothesis to Perform Linear Bending Analysis of Plates Reinforced by Beams

Authors: Gabriela R. Fernandes, Renato F. Denadai, Guido J. Denipotti

Abstract:

In this work, two formulations of the boundary element method (BEM) for the linear bending analysis of plates reinforced by beams are discussed. Both formulations are based on Kirchhoff's hypothesis and are obtained from the reciprocity theorem applied to zoned plates, where each sub-region defines a beam or a slab. In the first model the problem values are defined along the interfaces and the external boundary. Then, in order to reduce the number of degrees of freedom, kinematic hypotheses are assumed along the beam cross-section, leading to a second formulation where the collocation points are defined along the beam skeleton instead of being placed on the interfaces. In these formulations no approximation of the generalized forces along the interface is required. Moreover, compatibility and equilibrium conditions along the interface are automatically imposed by the integral equation. Thus, these formulations require fewer approximations and the total number of degrees of freedom is reduced. In the numerical examples the differences between these two BEM formulations are discussed, and the results are also compared to those of a well-known finite element code.

Keywords: Boundary elements, Building floor structures, Plate bending.

4832 Two Approaches to Code Mobility in an Agent-based E-commerce System

Authors: Costin Badica, Maria Ganzha, Marcin Paprzycki

Abstract:

Recently, a model multi-agent e-commerce system based on mobile buyer agents and transfer of strategy modules was proposed. In this paper a different approach to code mobility is introduced, in which agent mobility is replaced by local agent creation supplemented by code mobility similar to that of the original proposal. UML diagrams of the agents involved in the new approach to mobility and the augmented system activity diagram are presented and discussed.

Keywords: Agent system, agent mobility, code mobility, e-commerce, UML formalization.

4831 A Parallel Architecture for the Real Time Correction of Stereoscopic Images

Authors: Zohir Irki, Michel Devy

Abstract:

In this paper, we present an architecture for the implementation of a real-time stereoscopic image correction approach. The architecture is parallel and makes use of several memory blocks in which pre-calculated data relating to the cameras used for image acquisition are stored. The use of reduced images proves to be essential in the proposed approach; the suggested architecture must therefore be able to carry out the reduction of the original images in real time.
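
A minimal sketch of the image reduction step, assuming reduction by simple block averaging of a grayscale image; the actual reduction factor, correction tables and parallel hardware mapping of the paper are not modeled here.

```python
# Reduce a grayscale image by averaging non-overlapping k x k blocks.
# The random image and the factor k = 4 are illustrative assumptions.
import numpy as np

def block_reduce(img, k):
    h, w = img.shape
    h, w = h - h % k, w - w % k            # crop so dimensions divide evenly
    blocks = img[:h, :w].reshape(h // k, k, w // k, k)
    return blocks.mean(axis=(1, 3))

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(480, 640)).astype(float)
reduced = block_reduce(image, 4)
print(image.shape, "->", reduced.shape)     # (480, 640) -> (120, 160)
```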

Keywords: Image reduction, Real-time correction, Parallel architecture, Parallel treatment.

4830 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum had to be improved in order to reduce or eliminate defects and to raise the process capability indices Cp and Cpk of the CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied to improve the process, reduce defects, and ultimately reduce costs. A Taguchi-based Six Sigma approach was used to identify the optimized processing parameters that achieve the target surface roughness specified by the customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct and also improved the process capability index. The study shows that the Taguchi-based Six Sigma approach can be used efficiently to phase out defects and improve the process capability index of a CNC milling process.
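
For illustration, the sketch below computes the process capability indices Cp and Cpk and a Taguchi smaller-the-better signal-to-noise ratio from hypothetical surface roughness measurements; the specification limits and data are assumptions, not values from the case study.

```python
# Process capability (Cp, Cpk) and Taguchi smaller-the-better S/N ratio for
# hypothetical surface roughness readings; specification limits are assumed.
import numpy as np

roughness = np.array([58, 61, 57, 60, 59, 62, 58, 60, 61, 59], dtype=float)
usl, lsl = 70.0, 40.0                      # assumed upper/lower specification limits

mu, sigma = roughness.mean(), roughness.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

# Taguchi smaller-the-better S/N ratio: -10 * log10(mean of squared responses)
sn_ratio = -10 * np.log10(np.mean(roughness**2))

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, S/N = {sn_ratio:.2f} dB")
```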

Keywords: CNC machining, Six Sigma, Surface roughness, Taguchi methodology.

4829 A New Self-Adaptive EP Approach for ANN Weights Training

Authors: Kristina Davoian, Wolfram-M. Lippe

Abstract:

Evolutionary Programming (EP) is a branch of Evolutionary Algorithms (EA) in which mutation is considered the main reproduction operator. This paper presents a novel EP approach to Artificial Neural Network (ANN) learning. The proposed strategy consists of two components: a self-adaptive component, which carries phenotype information, and a dynamic component, which is described by the genotype. Self-adaptation is achieved by the addition of a value, called the network weight, which depends on the total number of hidden layers and the average number of neurons in the hidden layers. The dynamic component changes its value depending on the fitness of the chromosome exposed to mutation. Thus, the mutation step size is controlled by two components, encapsulated in the algorithm, which adjust it according to the characteristics of the predefined ANN architecture and the fitness of the particular chromosome. A comparative analysis of the proposed approach and classical EP (Gaussian mutation) showed that a significant acceleration of the evolution process is achieved by using both phenotype and genotype information in the mutation strategy.
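
A rough sketch of a self-adaptive EP-style mutation for an ANN weight vector, in which the Gaussian step size is scaled by a network-size term and by the parent's fitness; the specific scaling formulas here are illustrative assumptions, not the exact update rules of the paper.

```python
# Illustrative self-adaptive EP mutation for an ANN weight vector.
# step size = base * network_factor * fitness_factor; both factors are assumed forms.
import numpy as np

rng = np.random.default_rng(42)

def network_factor(n_hidden_layers, avg_neurons):
    # Assumed "network weight": larger networks mutate with smaller relative steps.
    return 1.0 / np.sqrt(n_hidden_layers * avg_neurons)

def mutate(weights, fitness, base_step=0.5, n_hidden_layers=2, avg_neurons=10):
    # Assumed dynamic component: worse fitness (larger error) -> larger steps.
    fitness_factor = 1.0 + fitness
    sigma = base_step * network_factor(n_hidden_layers, avg_neurons) * fitness_factor
    return weights + rng.normal(0.0, sigma, size=weights.shape)

parent = rng.normal(0.0, 1.0, size=50)      # flattened ANN weights
child = mutate(parent, fitness=0.3)
print("mean |change| =", np.abs(child - parent).mean())
```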

Keywords: Artificial Neural Networks (ANN), Learning Theory, Evolutionary Programming (EP), Mutation, Self-Adaptation.

4828 A Genetic-Algorithm-Based Approach for Audio Steganography

Authors: Mazdak Zamani , Azizah A. Manaf , Rabiah B. Ahmad , Akram M. Zeki , Shahidan Abdullah

Abstract:

In this paper, we present a novel, principled approach to resolving the remaining problems of the substitution technique of audio steganography. Using the proposed genetic algorithm, message bits are embedded into multiple, vague and higher LSB layers, resulting in increased robustness. The robustness is especially increased against intentional attacks that try to reveal the hidden message, as well as against some unintentional attacks such as noise addition.
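
The sketch below shows plain LSB-layer embedding of message bits into 16-bit audio samples at a chosen bit layer; the genetic search over layers and sample positions described in the abstract is replaced here by a fixed, assumed layer for brevity.

```python
# Embed message bits into a chosen bit layer of 16-bit audio samples.
# Layer selection by the genetic algorithm is out of scope; layer=2 is an assumption.
import numpy as np

def embed(samples, bits, layer=2):
    out = samples.copy()
    for i, bit in enumerate(bits):
        s = int(out[i])
        out[i] = (s & ~(1 << layer)) | (bit << layer)   # clear the layer bit, then set it
    return out

def extract(samples, n_bits, layer=2):
    return [(int(s) >> layer) & 1 for s in samples[:n_bits]]

rng = np.random.default_rng(7)
audio = rng.integers(0, 2**16, size=1000, dtype=np.uint16)
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(audio, message)
print(extract(stego, len(message)) == message)   # True: message recovered
```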

Keywords: Artificial Intelligence, Audio Steganography, Data Hiding, Genetic Algorithm, Substitution Techniques.

4827 Integrated Subset Split for Balancing Network Utilization and Quality of Routing

Authors: S. V. Kasmir Raja, P. Herbert Raj

Abstract:

The overlay approach has been widely used by many service providers for Traffic Engineering (TE) in large Internet backbones. In the overlay approach, logical connections are set up between edge nodes to form a full-mesh virtual network on top of the physical topology. IP routing is then run over the virtual network. Traffic engineering objectives are achieved by carefully routing logical connections over the physical links. Although the overlay approach has been implemented in many operational networks, it has a number of well-known scaling issues. This paper proposes a new approach to achieve traffic engineering without full-mesh overlaying, with the help of an integrated approach and an equal subset split method. Traffic engineering needs to determine the optimal routing of traffic over the existing network infrastructure by efficiently allocating resources in order to optimize traffic performance on an IP network. Although constraint-based routing [1] with Multi-Protocol Label Switching (MPLS) has been developed to address this need, it is not yet widely tested or debugged, so Internet Service Providers (ISPs) resort to TE methods under Open Shortest Path First (OSPF), the most commonly used intra-domain routing protocol. Determining OSPF link weights for optimal network performance is an NP-hard problem. As it is not possible to solve this problem exactly, we present a subset split method to improve efficiency and performance by minimizing the maximum link utilization in the network via a small number of link weight modifications. The results of this method are compared against the results of the MPLS architecture [9] and other heuristic methods.
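
A small sketch of the underlying idea of tuning OSPF link weights to lower the maximum link utilization: demands are routed on shortest paths and a greedy pass tries weight increments on the most loaded link. The toy topology, capacities, demands and the simple hill-climbing rule are assumptions, not the paper's integrated subset split method.

```python
# Toy OSPF weight tuning: route demands on shortest paths, then try raising the
# weight of the most utilized link to push traffic elsewhere. Topology, capacities
# and demands are made up; this is not the subset split algorithm itself.
import networkx as nx

def max_utilization(G, demands):
    load = {e: 0.0 for e in G.edges()}
    for (s, t, vol) in demands:
        path = nx.shortest_path(G, s, t, weight="weight")
        for u, v in zip(path, path[1:]):
            e = (u, v) if (u, v) in load else (v, u)
            load[e] += vol
    return max(load[e] / G.edges[e]["capacity"] for e in load), load

G = nx.Graph()
for u, v, cap in [("a", "b", 10), ("b", "c", 10), ("a", "d", 10), ("d", "c", 10), ("b", "d", 10)]:
    G.add_edge(u, v, weight=1, capacity=cap)
demands = [("a", "c", 6), ("a", "c", 6), ("b", "c", 4)]

best, load = max_utilization(G, demands)
for _ in range(5):                                   # a few greedy weight changes
    worst = max(load, key=lambda e: load[e] / G.edges[e]["capacity"])
    G.edges[worst]["weight"] += 1
    util, load = max_utilization(G, demands)
    best = min(best, util)
print("best max link utilization found:", round(best, 2))
```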

Keywords: Constraint based routing, Link Utilization, Subset split method, Traffic Engineering.

4826 Genetic Programming Based Data Projections for Classification Tasks

Authors: César Estébanez, Ricardo Aler, José M. Valls

Abstract:

In this paper we present a GP-based method for automatically evolving projections, so that data can be more easily classified in the projected spaces. At the same time, our approach can reduce dimensionality by constructing more relevant attributes. The fitness of each projection measures how easy it is to classify the dataset after applying the projection; this is quickly computed by a simple linear perceptron. We have tested our approach in three domains. The experiments show that it obtains good results compared to other machine learning approaches, while reducing dimensionality in many cases.
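
A minimal sketch of the fitness evaluation: a candidate projection (here a fixed, hand-written function standing in for a GP-evolved expression) is applied to the data and scored by the training accuracy of a linear perceptron; the dataset and the projection itself are illustrative assumptions.

```python
# Score a candidate projection by how well a linear perceptron separates the
# projected data. The projection below stands in for a GP-evolved expression tree.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.linear_model import Perceptron

def candidate_projection(X):
    # Hypothetical evolved attributes: two nonlinear combinations of the inputs.
    return np.column_stack([X[:, 0] * X[:, 1], X[:, 0] ** 2 - X[:, 1]])

def fitness(projection, X, y):
    Z = projection(X)
    clf = Perceptron(max_iter=1000, tol=1e-3, random_state=0).fit(Z, y)
    return clf.score(Z, y)            # training accuracy as a cheap fitness proxy

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
print("identity fitness  :", fitness(lambda X: X, X, y))
print("projection fitness:", fitness(candidate_projection, X, y))
```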

Keywords: Classification, genetic programming, projections.

4825 BIBD-s for (13, 5, 5), (16, 6, 5) and (21, 6, 4) Possessing Possibly an Automorphism of Order 3

Authors: Ivica Martinjak, Mario-Osvin Pavcevic

Abstract:

When trying to enumerate all BIBDs with given parameters, the natural solution space appears to be huge and grows extremely fast with the number of points of the design. Therefore, constructive enumerations are often carried out by assuming additional constraints on the design's structure, automorphisms being the most commonly used ones. It remains a hard task to construct designs with a trivial automorphism group – those with no additional symmetry – although it is believed that most BIBDs belong to that case. In this paper, very many new designs with parameters 2-(13, 5, 5), 2-(16, 6, 5) and 2-(21, 6, 4) are constructed, assuming the action of an automorphism of order 3. Moreover, it was possible to construct millions of such designs with no non-trivial automorphisms.
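
To make the object concrete, the sketch below checks whether a 0/1 incidence matrix N is a 2-(v, k, λ) design by verifying the block sizes, the replication number and the identity N·Nᵀ = (r − λ)I + λJ; the Fano-plane example used here is an assumption for illustration (it is a 2-(7, 3, 1) design, not one of the parameter sets from the paper).

```python
# Check a 2-(v, k, lam) design via its incidence matrix N (points x blocks):
# every block has k points, every point lies in r blocks, and N @ N.T == (r-lam)I + lam*J.
import numpy as np

def is_2_design(N, k, lam):
    v, b = N.shape
    r = lam * (v - 1) // (k - 1)               # standard BIBD relation
    if not np.all(N.sum(axis=0) == k):          # block sizes
        return False
    if not np.all(N.sum(axis=1) == r):          # replication number
        return False
    expected = (r - lam) * np.eye(v, dtype=int) + lam * np.ones((v, v), dtype=int)
    return np.array_equal(N @ N.T, expected)

# Fano plane, a 2-(7, 3, 1) design, as a small sanity check.
blocks = [(0, 1, 2), (0, 3, 4), (0, 5, 6), (1, 3, 5), (1, 4, 6), (2, 3, 6), (2, 4, 5)]
N = np.zeros((7, 7), dtype=int)
for j, blk in enumerate(blocks):
    N[list(blk), j] = 1
print(is_2_design(N, k=3, lam=1))   # True
```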

Keywords: BIBD, incidence matrix, automorphism group, tactical decomposition, deterministic algorithm.

4824 Mobility Analysis of the Population of Rabat-Salé-Zemmour-Zaer

Authors: F. Ghaiti

Abstract:

In this paper, we present the origin-destination and pricing survey that we carried out in the fall of 2006 in the Moroccan region of Rabat-Salé-Zemmour-Zaer. The survey concerns people's characteristics, their travel behavior and the price they would be willing to pay for a tramway ticket. The main objective is to study a set of features relative to households, to their travel habits and to their choices between public and private transport modes. A comparison between the results of this survey and those of the 1996 survey is made. A pricing scheme is also given according to the tram capacity. (The Rabat-Salé tramway is under construction right now and will be operational at the beginning of 2010.)

Keywords: Matrix O/D, Theory of pricing, Urban transport survey.

4823 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution

Authors: Nikolay P. Brayanov, Anna V. Stoynova

Abstract:

The model-based development approach is gaining support and acceptance. Its higher level of abstraction simplifies system description, allowing domain experts to do their best without particular knowledge of programming. The different levels of simulation support rapid prototyping and verifying and validating the product even before it exists physically. Nowadays the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process, especially in software qualification. Some companies report cost savings and quality improvements from using it, but others claim no major changes or even cost increases. This publication demonstrates the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module developed using tools from The MathWorks, Inc. The model, created with Simulink, Stateflow and Matlab, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the generated embedded code with that of manually developed code. The measurements show that, in general, the code generated by the automatic approach is no worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.

Keywords: Embedded code generation, embedded C code quality, embedded systems, model-based development.

4822 Absorption Spectra of Artificial Atoms in Presence of THz Fields

Authors: B. Dahiya, K.Batra, V.Prasad

Abstract:

Artificial atoms are a growing field of interest due to their physical and optoelectronic applications. The absorption spectra of the proposed artificial atom in the presence of a terahertz field are investigated theoretically. We use the non-perturbative Floquet theory and the finite difference method to study the electronic structure of the artificial atom. The effect of a static electric field on the energy levels of the artificial atom is studied. The effect of the orientation of the static electric field on the energy levels and dipole matrix elements is also highlighted.

Keywords: Absorption spectra, Artificial atom, Floquet Theory, THz fields

4821 A Socio-Technical Approach to Cyber-Risk Assessment

Authors: Kitty Kioskli, Nineta Polemi

Abstract:

Evaluating the levels of cyber-security risk within an enterprise is essential for protecting its information systems, services and all of its digital assets against security incidents (e.g. accidents, malicious acts, massive cyber-attacks). The existing risk assessment methodologies (e.g. eBIOS, OCTAVE, CRAMM, NIST-800) adopt a technical approach, considering as attack factors only the capability, intention and target of the attacker, while paying no attention to the attacker's psychological profile and personality traits. In this paper, a socio-technical approach to cyber risk assessment is proposed in order to achieve more realistic risk estimates by considering the personality traits of the attackers. In particular, based upon principles from investigative psychology and behavioural science, a multi-dimensional, extended, quantifiable model of an attacker's profile is developed, which becomes an additional factor in the cyber risk level calculation.

Keywords: Attacker, behavioural models, cyber risk assessment, cyber-security, human factors, investigative psychology, ISO27001, ISO27005.

4820 Wasteless Solid-Phase Method for Conversion of Iron Ores Contaminated with Silicon and Phosphorus Compounds

Authors: А. V. Panko, Е. V. Ablets, I. G. Kovzun, М. А. Ilyashov

Abstract:

Based upon a generalized analysis of modern know-how in the processing, concentration and purification of iron-ore raw materials (IORM), in particular the most widespread ferrioxide-silicate materials (FOSM) containing impurities of phosphorus and other elements' compounds, the special role of nanotechnological initiatives in the improvement of such processes is noted. Ideas on the role of nanoparticles in processes of FOSM carbonization with subsequent direct reduction of the ferric oxides contained in them to the metal phase, as well as in processes of alkali treatment and separation of powdered iron from phosphorus compounds, are considered. Using the obtained results, a wasteless method of solid-phase processing, concentration and purification of IORM and FOSM from compounds of phosphorus, silicon and other impurities was developed, and it surpasses known methods of direct iron reduction from iron ores and metallurgical slimes.

Keywords: Iron ores, solid-phase reduction, nanoparticles in reduction and purification of iron from silicon and phosphorus, wasteless method of ores processing.

4819 Robust Control for Discrete-Time Sector Bounded Systems with Time-Varying Delay

Authors: Ju H. Park, S.M. Lee

Abstract:

In this paper, we propose a robust controller design method for discrete-time systems with sector-bounded nonlinearities and time-varying delay. Based on Lyapunov theory, delay-dependent stabilization criteria are obtained in terms of linear matrix inequalities (LMIs) by constructing a new Lyapunov-Krasovskii functional and using some inequalities. A robust state feedback controller is designed within the LMI framework using a reciprocally convex combination technique. The effectiveness of the proposed method is verified through a numerical example.
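
As a much-simplified illustration of the LMI machinery, the sketch below checks discrete-time stability of a delay-free system by searching for a Lyapunov matrix P satisfying P > 0 and AᵀPA − P < 0 with a convex solver; the paper's delay-dependent Lyapunov-Krasovskii criteria and controller synthesis LMIs are considerably more involved.

```python
# Feasibility of the basic discrete-time Lyapunov LMI: P > 0, A' P A - P < 0.
# This is a toy stand-in for the delay-dependent LMIs of the paper.
import numpy as np
import cvxpy as cp

A = np.array([[0.8, 0.1],
              [0.0, 0.7]])                 # assumed stable example system
n = A.shape[0]
eps = 1e-6

P = cp.Variable((n, n), symmetric=True)
constraints = [P >> eps * np.eye(n),
               A.T @ P @ A - P << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible (system stable)?", prob.status == cp.OPTIMAL)
```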

Keywords: Lur'e systems, Time-delay, Stabilization, LMIs.

4818 A Flipped Classroom Approach for Non-Science Majors

Authors: Nidhi Gadura

Abstract:

To ensure student success in a non-majors biology course, a flipped classroom pedagogical approach was developed and implemented. All students were assigned online lectures to listen to before coming to class. A three-hour lecture was split into one hour of online content, one hour of in-class lecture and one hour of worksheets completed by students in the classroom. This deviation from a traditional three-hour in-class lecture resulted in increased student interest in science as well as a better understanding of difficult scientific concepts. A pre- and post-survey was given to measure interest in the subject, and grades were used to measure success rates. While the overall grade average did not change dramatically, students reported a much better appreciation of biology. Students also overwhelmingly liked the use of worksheets in class to help them understand the concepts, and they liked the fact that they could listen to the lectures online at their own pace and even repeat them if needed. The flipped classroom approach turned out to work very well for our non-science majors, and the author is ready to implement it in other classrooms.

Keywords: Flipped classroom, non-science majors, pedagogy, technological pedagogical model.

4817 A New Method for Estimation of the Source Coherency Structure of Wideband Sources

Authors: Yong-jun Zhao, Heng-li Zhang, Zong-yun Hu

Abstract:

Based on the sources' smoothed rank profile (SRP) and the modified minimum description length (MMDL) principle, a method for estimating the source coherency structure (SCS) and the number of wideband sources is proposed in this paper. Instead of focusing, we first use a spatial smoothing technique to pre-process the array covariance matrix of each frequency in order to de-correlate the sources, and then use the smoothed rank profile to determine the SCS and the number of wideband sources. We demonstrate the effectiveness of the method by numerical simulations.
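
The sketch below illustrates the two ingredients for a single narrowband snapshot: forward spatial smoothing of a sample covariance matrix over subarrays, followed by inspection of its eigenvalue profile to judge the signal subspace rank. The array geometry, signals and the informal eigenvalue reading are assumptions; the MMDL-based decision rule of the paper is not reproduced.

```python
# Forward spatial smoothing of a ULA covariance matrix, then eigenvalue ("rank")
# profile. Two coherent sources are used so that smoothing is actually needed.
import numpy as np

rng = np.random.default_rng(3)
M, sub, snapshots = 10, 6, 500                       # sensors, subarray size, snapshots
doas = np.deg2rad([10.0, 25.0])                      # assumed directions of arrival

# Steering matrix for a half-wavelength ULA and two fully coherent sources.
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(doas)))
s = rng.standard_normal(snapshots) + 1j * rng.standard_normal(snapshots)
S = np.vstack([s, 0.9 * s])                          # second source coherent with first
X = A @ S + 0.1 * (rng.standard_normal((M, snapshots))
                   + 1j * rng.standard_normal((M, snapshots)))
R = X @ X.conj().T / snapshots

# Forward spatial smoothing: average the covariance over sliding subarrays.
R_ss = np.zeros((sub, sub), dtype=complex)
for k in range(M - sub + 1):
    R_ss += R[k:k + sub, k:k + sub]
R_ss /= (M - sub + 1)

eigs = np.sort(np.linalg.eigvalsh(R_ss))[::-1]
print("eigenvalue profile:", np.round(eigs, 3))      # two dominant values expected
```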

Keywords: Wideband sources, source coherency structure (SCS), smoothed rank profile (SRP).

4816 2D Bar Codes Reading: Solutions for Camera Phones

Authors: Hao Wang, Yanming Zou

Abstract:

Two-dimensional (2D) bar codes were designed to carry significantly more data with higher information density and robustness than their 1D counterparts. Thanks to the popular combination of cameras and mobile phones, using the camera phone for 2D bar code reading naturally brings great commercial value. This paper addresses the problem of 2D bar code design specific to mobile phones and introduces a low-level encoding method for matrix codes. At the same time, we propose an efficient scheme for 2D bar code decoding, in which the effort is put on solutions to the difficulties introduced by the low image quality that is very common in bar code images taken by a phone camera.

Keywords: 2D bar code reading, camera phone, low-level encoding, mixed model

4815 The Effect of Insurance on Foreign Direct Investments Inflow to Nigeria

Authors: Chimaobi V. Okolo, Afamefuna J. Ani, Ebere U. Okolo

Abstract:

This paper seeks to assess the implications of insurance for foreign direct investment inflow in Nigeria. The multiple linear regression technique and a correlation matrix test were employed to measure the extent to which foreign direct investment was influenced. The results showed that insurance premium (IP), the asset size of the insurance industry (AS), and the total investment of the industry (TI) had significant positive impacts on foreign direct investment inflow in Nigeria. There should be an effective risk transfer mechanism and financial intermediation, which give investors confidence in the risk management strength of the host country.
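
For readers who want to reproduce this type of analysis, the sketch below runs an ordinary least squares regression of FDI on three explanatory variables and prints the correlation matrix; the synthetic data and the variable names (IP, AS, TI) only mirror the abstract and are not the study's data.

```python
# OLS regression of FDI on insurance premium (IP), asset size (AS) and total
# investment (TI), plus a correlation matrix. Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 30
IP = rng.uniform(1, 10, n)
AS = rng.uniform(5, 50, n)
TI = rng.uniform(2, 20, n)
FDI = 0.8 * IP + 0.3 * AS + 0.5 * TI + rng.normal(0, 1, n)    # assumed relationship

X = np.column_stack([np.ones(n), IP, AS, TI])                  # add intercept
coef, *_ = np.linalg.lstsq(X, FDI, rcond=None)
print("intercept, b_IP, b_AS, b_TI =", np.round(coef, 3))
print("correlation matrix:\n", np.round(np.corrcoef([FDI, IP, AS, TI]), 2))
```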

Keywords: Foreign direct investment, insurance.

4814 A Combined Approach of a Sequential Life Testing and an Accelerated Life Testing Applied to a Low-Alloy High Strength Steel Component

Authors: D. I. De Souza, D. R. Fonseca, G. P. Azevedo

Abstract:

Sometimes the amount of time available for testing can be considerably less than the expected lifetime of the component. To overcome such a problem, there is the accelerated life-testing alternative, aimed at forcing components to fail by testing them at much higher-than-intended application conditions. These models are known as acceleration models. One possible way to translate test results obtained under accelerated conditions to normal use conditions is through the application of the "Maxwell Distribution Law." In this paper we apply a combined approach of sequential life testing and accelerated life testing to a low-alloy high-strength steel component used in the construction of overpasses in Brazil. The underlying sampling distribution is the three-parameter Inverse Weibull model. To estimate the three parameters of the Inverse Weibull model we use a maximum likelihood approach for censored failure data. A linear acceleration condition is assumed. To evaluate the accuracy (significance) of the parameter values obtained under normal conditions for the underlying Inverse Weibull model, we apply a sequential life test with a truncation mechanism to the expected normal failure times. An example illustrates the application of this procedure.
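
A brief sketch of fitting a three-parameter inverse Weibull (Fréchet) distribution to complete, uncensored failure-time data by maximum likelihood using SciPy's generic fitter; censoring, the accelerated-to-normal translation and the sequential truncation test of the paper are not handled here, and the simulated data and parameter values are assumptions.

```python
# Three-parameter inverse Weibull (Frechet) fit by maximum likelihood on
# simulated, uncensored failure times. Censored-data likelihoods and the
# accelerated-to-normal translation are beyond this sketch.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
true_shape, true_loc, true_scale = 2.5, 10.0, 100.0
data = stats.invweibull.rvs(true_shape, loc=true_loc, scale=true_scale,
                            size=200, random_state=rng)

shape_hat, loc_hat, scale_hat = stats.invweibull.fit(data)
print(f"shape={shape_hat:.2f}, loc={loc_hat:.2f}, scale={scale_hat:.2f}")
```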

Keywords: Sequential Life Testing, Accelerated Life Testing, Underlying Three-Parameter Weibull Model, Maximum Likelihood Approach, Hypothesis Testing.

4813 Identifying Factors Contributing to the Spread of Lyme Disease: A Regression Analysis of Virginia’s Data

Authors: Fatemeh Valizadeh Gamchi, Edward L. Boone

Abstract:

This research focuses on Lyme disease, a widespread infectious condition in the United States caused by the bacterium Borrelia burgdorferi sensu stricto. It is critical to identify the environmental and economic elements that contribute to the spread of the disease. This study examined data from Virginia to identify a subset of explanatory variables significant for Lyme disease case numbers. To identify relevant variables and avoid overfitting, a linear Poisson model and regularized regression methods such as ridge, lasso, and the elastic net penalty were employed. Cross-validation was performed to acquire the tuning parameters. The proposed methods can automatically identify relevant disease count covariates. The efficacy of the techniques was assessed using four criteria on three simulated datasets. Finally, using the Virginia Department of Health's Lyme disease dataset, the study successfully identified key factors, and the results were consistent with previous studies.
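
A small sketch in the same spirit: a Poisson GLM with an L2 (ridge) penalty fitted to simulated count data, with the penalty strength chosen by cross-validation. Scikit-learn's PoissonRegressor supports only the ridge penalty, so the lasso and elastic net variants mentioned in the abstract would need other tooling; the data and coefficients here are synthetic assumptions.

```python
# Ridge-penalized Poisson regression on simulated count data, with the penalty
# strength selected by cross-validated grid search. Lasso / elastic-net Poisson
# fits would require different libraries; data and coefficients are made up.
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(11)
n, p = 300, 6
X = rng.standard_normal((n, p))
true_beta = np.array([0.8, -0.5, 0.0, 0.0, 0.3, 0.0])        # sparse truth
y = rng.poisson(np.exp(0.2 + X @ true_beta))                  # Poisson counts

search = GridSearchCV(PoissonRegressor(max_iter=1000),
                      param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]},
                      cv=5)
search.fit(X, y)
print("best alpha:", search.best_params_["alpha"])
print("coefficients:", np.round(search.best_estimator_.coef_, 2))
```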

Keywords: Lyme disease, Poisson generalized linear model, ridge regression, lasso regression, elastic net regression.

4812 Characteristic Function in Estimation of Probability Distribution Moments

Authors: Vladimir S. Timofeev

Abstract:

In this article the problem of estimating distributional moments is considered. A new approach to moment estimation based on the use of the characteristic function is proposed. Using a statistical simulation technique, the author shows that the new approach has certain robustness properties. Numerical differentiation is used to calculate the derivatives of the characteristic function. The obtained results confirm that the author's idea has a certain working efficiency and can be recommended for statistical applications.
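
A compact sketch of the idea: the k-th raw moment equals the k-th derivative of the characteristic function at zero divided by i^k, so moments can be estimated by numerically differentiating the empirical characteristic function; the step size and the simulated sample below are illustrative assumptions, not the author's exact estimator.

```python
# Estimate raw moments from the empirical characteristic function (ECF):
# E[X^k] = phi^(k)(0) / i^k, with the derivatives taken by finite differences.
import numpy as np

def ecf(t, x):
    return np.mean(np.exp(1j * t * x))

def moments_from_ecf(x, h=1e-3):
    # central differences for the first and second derivatives of the ECF at 0
    d1 = (ecf(h, x) - ecf(-h, x)) / (2 * h)
    d2 = (ecf(h, x) - 2 * ecf(0.0, x) + ecf(-h, x)) / h**2
    m1 = (d1 / 1j).real           # E[X]
    m2 = (d2 / 1j**2).real        # E[X^2]
    return m1, m2

rng = np.random.default_rng(2)
sample = rng.normal(1.0, 2.0, size=5000)        # assumed N(1, 2^2) data
m1, m2 = moments_from_ecf(sample)
print(f"mean estimate = {m1:.3f}, second raw moment estimate = {m2:.3f}")
```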

Keywords: Characteristic function, distributional moments, robustness, outlier, statistical estimation problem, statistical simulation.

4811 Algorithms for the Fast Computation of PWL and PHL Transforms

Authors: Fituri H Belgassem, Abdulbasit Nigrat, Seddeeq Ghrari

Abstract:

In this paper, the construction of fast algorithms for the computation of the Periodic Walsh Piecewise-Linear (PWL) transform and the Periodic Haar Piecewise-Linear (PHL) transform is presented. Algorithms for the computation of the inverse transforms are also proposed. The matrix equations of the PWL and PHL transforms are introduced. A comparison of the computational requirements of the periodic piecewise-linear transforms and other orthogonal transforms shows that the periodic piecewise-linear transforms require fewer operations than orthogonal transforms such as the Fourier, Walsh and Discrete Cosine transforms.

Keywords: Piecewise-linear transforms, Fast transforms, Fast algorithms.

4810 Unit Commitment Solution Methods

Authors: Sayeed Salam

Abstract:

Developing a unit commitment approach capable of handling large power systems consisting of both thermal and hydro generating units offers a large profitable return. In order to be feasible, the method to be developed must be flexible, efficient and reliable. In this paper, various proposed methods are described along with their strengths and weaknesses. As all of these methods have weaknesses of some sort, a comprehensive algorithm that combines the strengths of the different methods and overcomes their individual weaknesses would be a suitable approach to solving industry-grade unit commitment problems.

Keywords: Unit commitment, Solution methods, and Comprehensive algorithm.

4809 Heritability and Repeatability Estimates of Some Measurable Traits in Meat Type Chickens Reared for Ten Weeks in Abeokuta, Nigeria

Authors: A. J. Sanda, O. Olowofeso, M. A. Adeleke, A. O Oso, S. O. Durosaro, M. O. Sanda

Abstract:

A total of 150 meat-type chickens, comprising 50 each of Arbor Acre, Marshall and Ross, were used for this study, which lasted for 10 weeks at the Federal University of Agriculture, Abeokuta, Nigeria. Growth performance data were collected from the third week through week 10, and the data obtained were analysed using the Generalized Linear Model procedure. Heritability estimates (h2) for the body dimensions of the chicken strains ranged from low to high. The Marshall broiler strain had the highest h2 for body weight, 0.46±0.04, followed by Arbor Acre and Ross with h2 of 0.38±0.12 and 0.26±0.06, respectively. The repeatability estimates for body weight in the three broiler strains were high, ranging from 0.70 at week 4 to 0.88 at week 10. Relationships between body weight and linear body measurements in the broiler chicken strains were positive and highly significant (p < 0.05).

Keywords: Broiler chicken strains, heritability, repeatability, traits.
