Search results for: Francesco Carlo Morabito
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 195

45 First Studies of the Influence of Single Gene Perturbations on the Inference of Genetic Networks

Authors: Frank Emmert-Streib, Matthias Dehmer

Abstract:

Inferring network structure from time series data is a hard problem, especially if the time series are short and noisy. DNA microarrays, a technology that monitors the mRNA concentration of thousands of genes simultaneously, produce data with exactly these characteristics. In this study we investigate the influence of the experimental design on the quality of the result. More precisely, we investigate the influence of two different types of random single gene perturbations on the inference of genetic networks from time series data. To obtain an objective quality measure for this influence, we simulate gene expression values with a biologically plausible model of a known network structure. Within this framework we study the influence of single gene knock-outs, as opposed to linearly controlled expression of single genes, on the quality of the inferred network structure.
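
A minimal, self-contained sketch of the experimental idea (not the authors' dynamic Bayesian network/MCMC pipeline): simulate a known linear network, apply either a knock-out or a linearly controlled profile to one gene per experiment, and score a naive lagged-correlation inference against the ground truth. All sizes and thresholds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, T = 10, 20
# random sparse ground-truth network (assumed, for illustration only)
A = (rng.random((n_genes, n_genes)) < 0.2) * rng.normal(0, 0.5, (n_genes, n_genes))

def simulate(perturbed, mode):
    """Linear dynamics x_{t+1} = A x_t + noise, with one gene perturbed."""
    x = rng.normal(0, 1, n_genes)
    traj = []
    for t in range(T):
        x = A @ x + rng.normal(0, 0.1, n_genes)
        if mode == "knockout":
            x[perturbed] = 0.0            # gene silenced for the whole experiment
        else:                             # linearly controlled expression
            x[perturbed] = 1.0 - t / T
        traj.append(x.copy())
    return np.array(traj)

def infer_and_score(mode):
    data = np.vstack([simulate(g, mode) for g in range(n_genes)])
    c = np.corrcoef(data[:-1].T, data[1:].T)[:n_genes, n_genes:]
    guess = np.abs(c) > 0.3               # naive lagged-correlation edge test
    truth = A.T != 0                      # edge i -> j exists iff A[j, i] != 0
    return (guess == truth).mean()        # fraction of gene pairs classified correctly

for mode in ("knockout", "controlled"):
    print(mode, "accuracy:", infer_and_score(mode))
```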

Keywords: Dynamic Bayesian networks, microarray data, structure learning, Markov chain Monte Carlo.

44 A New Performance Characterization of Transient Analysis Method

Authors: José Peralta, Gabriela Peretti, Eduardo Romero, Carlos Marqués

Abstract:

This paper proposes a new performance characterization for the test strategy intended for second order filters, known as the Transient Analysis Method (TRAM). We evaluate the ability of the addressed test strategy to detect deviation faults under simultaneous statistical fluctuation of the non-faulty parameters. For this purpose, we use Monte Carlo simulations and a fault model that considers as faulty only one component of the filter under test, while the other components adopt random values (within their tolerance band) obtained from their statistical distributions. The new data reported here show (for the filters under study) the presence of hard-to-test components and relatively low fault coverage values for small deviation faults. These results suggest that the fault coverage value obtained using only nominal values for the non-faulty components (the traditional evaluation of TRAM) seems to be a poor predictor of the test performance.
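
A minimal sketch of the fault model described above, assuming a hypothetical second-order filter and a placeholder pass/fail criterion in place of the actual TRAM transient test: one component is held at its deviated value while the others are drawn at random within their tolerance bands, and the detection rate estimates the fault coverage.

```python
import numpy as np

rng = np.random.default_rng(1)
nominal = {"R1": 10e3, "R2": 10e3, "C1": 10e-9, "C2": 10e-9}   # hypothetical values
tol = 0.05                                                      # 5% tolerance band
n_runs, deviation = 10_000, 1.20                                # +20% deviation fault

def is_detected(c):
    # Placeholder for the TRAM pass/fail decision on the transient response;
    # here we flag the circuit if the natural frequency shifts by more than 5%.
    f0 = 1 / (2 * np.pi * np.sqrt(c["R1"] * c["R2"] * c["C1"] * c["C2"]))
    f0_nom = 1 / (2 * np.pi * np.sqrt(nominal["R1"] * nominal["R2"]
                                      * nominal["C1"] * nominal["C2"]))
    return abs(f0 - f0_nom) / f0_nom > 0.05

detected = 0
for _ in range(n_runs):
    c = {k: v * rng.uniform(1 - tol, 1 + tol) for k, v in nominal.items()}
    c["R1"] = nominal["R1"] * deviation        # the single faulty component
    detected += is_detected(c)
print("fault coverage estimate:", detected / n_runs)
```

Evaluating the same fault with the non-faulty components fixed at nominal, as in the traditional evaluation, gives a single pass/fail outcome instead of this coverage estimate.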

Keywords: testing, fault analysis, analog filter test, parametric faults detection.

43 Environmental and Economic Scenario Analysis of the Redundant Golf Courses in Japan

Authors: Osamu Saito

Abstract:

Commercial infrastructures intended for use as leisure retreats, such as golf and ski resorts, have been extensively developed in many rural areas of Japan. However, following the burst of the economic bubble in the 1990s, several existing resorts faced tough management decisions and some were forced to close their business. In this study, six alternative management options for restructuring the existing golf courses (park, cemetery, biofuel production, reforestation, pasturing and abandonment) are examined and their environmental and economic impacts are quantitatively assessed. In addition, restructuring scenarios built from these options and an ex-ante assessment model are developed. The scenario analysis by Monte Carlo simulation shows a clear trade-off between GHG savings and benefit/cost (B/C) ratios: the "Restoring Nature" scenario absorbs the most CO2 among the four scenarios considered, but its B/C ratio is the lowest. This study can be used to select or examine options and scenarios of golf course management and rural environmental management policies.

Keywords: golf courses, restructuring and management options, scenario analysis, Tokyo Metropolitan Area.

42 Efficient Tools for Managing Uncertainties in Design and Operation of Engineering Structures

Authors: J. Menčík

Abstract:

Actual loads, material characteristics and other quantities often differ from the design values. This can cause impaired function, shorter life or failure of a civil engineering structure, a machine, a vehicle or another appliance. The paper shows the main causes of these uncertainties and deviations and presents a systematic approach and efficient tools for eliminating them or mitigating their consequences. Emphasis is put on the design stage, which is the most important stage for ensuring reliability. Principles of robust design and important tools are explained, including FMEA, sensitivity analysis and probabilistic simulation methods. The lifetime prediction of long-life objects can be improved by long-term monitoring of the load response and damage accumulation in operation. The condition evaluation of engineering structures, such as bridges, is often based on visual inspection and verbal description; here, methods based on fuzzy logic can reduce the subjective influences.
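
The probabilistic simulation mentioned above can be illustrated with a few lines of Monte Carlo, assuming notional load and strength distributions (not taken from the paper): failure occurs whenever the sampled load exceeds the sampled strength.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
load = rng.normal(100.0, 15.0, n)                  # kN, assumed mean and st. dev.
strength = rng.lognormal(np.log(160.0), 0.10, n)   # kN, assumed median and dispersion
pf = np.mean(load > strength)                      # estimated probability of failure
print(f"P_f ≈ {pf:.2e}")
```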

Keywords: Design, fuzzy methods, Monte Carlo, reliability, robust design, sensitivity analysis, simulation, uncertainties.

41 Influence of Textured Clusters on the Goss Grain Growth in Silicon Steels: Consideration of Energy and Mobility

Authors: H. Afer, N. Rouag, R. Penelle

Abstract:

In Fe-3%Si sheets, grade Hi-B, with AlN and MnS as inhibitors, the Goss grains that grow abnormally are no larger than the average grain of the primary matrix. In this heterogeneous microstructure, the size factor is not a required condition for secondary recrystallization. The onset of abnormal growth of the small Goss grains appears to be related to a particular behavior of their grain boundaries, to the local texture and to the distribution of the inhibitors. The presence and evolution of oriented clusters provide the small Goss grains with a neighborhood favorable to growth. The modified Monte Carlo approach applied here considers the local environment of each grain: grain growth depends on the grain's real spatial position, so the heterogeneity of the matrix is taken into account. The grain growth conditions are considered in the global matrix and in different matrices corresponding to A-component clusters. The grain growth behavior is examined with the introduction of energy only; energy and mobility; and energy, mobility and precipitates.

Keywords: Abnormal grain growth, grain boundary energy and mobility, neighbourhood, oriented clusters.

40 Statistical Description of Counterpoise Effective Length Based On Regressive Formulas

Authors: Petar Sarajcev, Josip Vasilj, Damir Jakus

Abstract:

This paper presents a novel statistical description of the counterpoise effective length due to lightning surges, where the (impulse) effective length had been obtained by means of regressive formulas applied to the transient simulation results. The effective length is described in terms of a statistical distribution function, from which median, mean, variance, and other parameters of interest could be readily obtained. The influence of lightning current amplitude, lightning front duration, and soil resistivity on the effective length has been accounted for, assuming statistical nature of these parameters. A method for determining the optimal counterpoise length, in terms of the statistical impulse effective length, is also presented. It is based on estimating the number of dangerous events associated with lightning strikes. Proposed statistical description and the associated method provide valuable information which could aid the design engineer in optimising physical lengths of counterpoises in different grounding arrangements and soil resistivity situations.
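
A hedged sketch of the statistical procedure: sample lightning current amplitude, front duration and soil resistivity from assumed distributions, map each draw to an impulse effective length through a regressive formula, and study the resulting distribution. The formula and its constants below are placeholders, not the paper's regressive formulas.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
I = rng.lognormal(np.log(31.1), 0.48, n)     # current amplitude (kA), typical first-stroke statistics
tf = rng.lognormal(np.log(3.8), 0.55, n)     # front duration (us), assumed
rho = rng.uniform(100, 1000, n)              # soil resistivity (ohm.m), assumed range

# placeholder regressive formula: effective length grows with sqrt(rho * tf)
L_eff = 0.6 * np.sqrt(rho * tf) * (I / 31.1) ** 0.1

print("median:", np.median(L_eff))
print("mean:", L_eff.mean(), " variance:", L_eff.var())
```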

Keywords: Counterpoise, Grounding conductor, Effective length, Lightning, Monte Carlo method, Statistical distribution.

39 A Double Differential Chaos Shift Keying Scheme for Ultra-Wideband Chaotic Communication Technology Applied in Low-Rate Wireless Personal Area Network

Authors: Ghobad Gorji, Hasan Golabi

Abstract:

The goal of this paper is to describe the design of an ultra-wideband (UWB) system that is optimized for the low-rate wireless personal area network application. To this aim, we propose a system based on direct chaotic communication (DCC) technology. In this system, a 2-GHz-wide chaotic signal is produced in the lower UWB band, i.e., 3.1–5.1 GHz. Two simple modulation schemes, namely chaotic on-off keying (COOK) and differential chaos shift keying (DCSK), are evaluated first. We then propose a modulation scheme, namely Double DCSK, to improve the performance of UWB DCC. The characteristics of these systems are compared using Monte Carlo simulations based on the Additive White Gaussian Noise (AWGN) and IEEE 802.15.4a standard channel models.
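
For orientation, a compact Monte Carlo BER simulation of plain DCSK over AWGN (logistic-map-type chaos, correlator receiver) is sketched below; the paper's UWB waveforms, the Double DCSK variant and the IEEE 802.15.4a channel are not modeled, and the spreading factor and Eb/N0 are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)

def chaos(n, x0):
    x = np.empty(n); x[0] = x0
    for i in range(1, n):
        x[i] = 1 - 2 * x[i - 1] ** 2       # Chebyshev-type chaotic map on [-1, 1]
    return x

M, n_bits, EbN0_dB = 32, 20_000, 8
Eb = 2 * M * 0.5                            # mean chip energy of the map is ~0.5
N0 = Eb / 10 ** (EbN0_dB / 10)
errors = 0
for _ in range(n_bits):
    b = rng.choice([-1, 1])
    ref = chaos(M, rng.uniform(-1, 1))
    tx = np.concatenate([ref, b * ref])     # reference slot + data-bearing slot
    rx = tx + rng.normal(0, np.sqrt(N0 / 2), 2 * M)
    decision = np.sign(rx[:M] @ rx[M:])     # correlate the two slots
    errors += decision != b
print("BER ≈", errors / n_bits)
```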

Keywords: Ultra-wideband, UWB, Direct Chaotic Communication, DCC, IEEE 802.15.4a, COOK, DCSK.

38 Idiopathic Constipation Can Be Subdivided into Clinical Subtypes: Data Mining by Cluster Analysis on a Population-Based Study

Authors: Mauro Giacomini, Stefania Bertone, Carlo Mansi, Pietro Dulbecco, Vincenzo Savarino

Abstract:

The prevalence of non-organic constipation differs from country to country and the reliability of the estimated rates is uncertain. Moreover, the clinical relevance of subdividing the heterogeneous functional constipation disorders into pre-defined subgroups is largely unknown. Aim: to estimate the prevalence of constipation in a population-based sample and determine whether clinical subgroups can be identified. An age- and gender-stratified sample population from 5 Italian cities was evaluated using a previously validated questionnaire. Data mining by cluster analysis was used to determine constipation subgroups. Results: 1,500 complete interviews were obtained from 2,083 contacted households (72%). Self-reported constipation correlated poorly with symptom-based constipation, which was found in 496 subjects (33.1%). Cluster analysis identified four constipation subgroups which corresponded to subgroups identified according to pre-defined symptom criteria. Significant differences in socio-demographics and lifestyle were observed among subgroups.

Keywords: Cluster analysis, constipation, data mining, statistical analysis.

37 Validation of Reverse Engineered Web Application Models

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult, due to the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.

Keywords: Validation, Dynamic Analysis, Mutation Analysis, Reverse Engineering, Web Applications

36 Model Free Terminal Sliding Mode with Gravity Compensation: Application to an Exoskeleton-Upper Limb System

Authors: Sana Bembli, Nahla Khraief Haddad, Safya Belghith

Abstract:

This paper deals with a robust model free terminal sliding mode with gravity compensation approach used to control an exoskeleton-upper limb system. The considered system is a 2-DoF robot in interaction with an upper limb used for rehabilitation. The aim of this paper is to control the flexion/extension movement of the shoulder and the elbow joints in the presence of matched disturbances. In the first part, we present the exoskeleton-upper limb system modeling. Then, we control the considered system by the model free terminal sliding mode with gravity compensation. A stability study is carried out. To prove the controller performance, a robustness analysis was needed. Simulation results are provided to confirm the robustness of the gravity compensation combined with the model free terminal sliding mode in the presence of uncertainties.

Keywords: Exoskeleton-upper limb system, gravity compensation, model free terminal sliding mode, robustness analysis, Monte Carlo, H∞ methods.

35 A Forward Automatic Censored Cell-Averaging Detector for Multiple Target Situations in Log-Normal Clutter

Authors: Musa'ed N. Almarshad, Saleh A. Alshebeili, Mourad Barkat

Abstract:

A challenging problem in radar signal processing is to achieve reliable target detection in the presence of interferences. In this paper, we propose a novel algorithm for automatic censoring of radar interfering targets in log-normal clutter. The proposed algorithm, termed the forward automatic censored cell averaging detector (F-ACCAD), consists of two steps: removing the corrupted reference cells (censoring) and the actual detection. Both steps are performed dynamically by using a suitable set of ranked cells to estimate the unknown background level and set the adaptive thresholds accordingly. The F-ACCAD algorithm requires neither prior information about the clutter parameters nor the number of interfering targets. The effectiveness of the F-ACCAD algorithm is assessed by computing, using Monte Carlo simulations, the probability of censoring and the probability of detection in different background environments.
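
An illustrative sketch of the two-step censor-then-detect idea, in the spirit of F-ACCAD; the paper's exact ranked-cell statistics and log-normal threshold factors are not reproduced, so the constants here are placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)

def censored_ca_detector(cells, test_cell, max_censor=4, t_censor=2.5, t_detect=3.0):
    """Rank reference cells, censor up to max_censor outliers, then compare the
    cell under test with a threshold scaled by the surviving cells' mean."""
    ranked = np.sort(cells)
    kept = ranked[: len(ranked) - max_censor]       # start from the lowest cells
    for _ in range(max_censor):                     # try restoring ranked cells forward
        candidate = ranked[len(kept)]
        if candidate <= t_censor * kept.mean():     # consistent with the clutter level
            kept = np.append(kept, candidate)
        else:
            break                                   # corrupted cells remain censored
    return test_cell > t_detect * kept.mean()

# log-normal clutter with two interfering targets in the reference window
clutter = rng.lognormal(0.0, 0.5, 16)
clutter[[3, 9]] *= 30.0                             # interferers corrupting two cells
print("target declared:", censored_ca_detector(clutter, test_cell=50.0))
```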

Keywords: CFAR, Log-normal clutter, Censoring, Probability of detection, Probability of false alarm, Probability of false censoring.

34 Bayesian Inference for Phase Unwrapping Using Conjugate Gradient Method in One and Two Dimensions

Authors: Yohei Saika, Hiroki Sakaematsu, Shota Akiyama

Abstract:

We investigated the statistical performance of Bayesian inference using maximum entropy and MAP estimation for several models which approximate wave-fronts in remote sensing using SAR interferometry. Using Monte Carlo simulation for a set of wave-fronts generated by an assumed true prior, we found that the method of maximum entropy realized the optimal performance around the Bayes-optimal conditions when using the model of the true prior and a likelihood representing the optical measurement by the interferometer. We also found that MAP estimation, regarded as a deterministic limit of maximum entropy, achieved almost the same performance as the Bayes-optimal solution for the set of wave-fronts. We then clarified that MAP estimation carries out phase unwrapping perfectly without using prior information, and that it realizes accurate phase unwrapping using the conjugate gradient (CG) method, if the model of the true prior is assumed appropriately.

Keywords: Bayesian inference using maximum entropy, MAP estimation using conjugate gradient method, SAR interferometry.

33 Maximizer of the Posterior Marginal Estimate for Noise Reduction of JPEG-compressed Image

Authors: Yohei Saika, Yuji Haraguchi

Abstract:

We constructed a method of noise reduction for JPEG-compressed images based on Bayesian inference using the maximizer of the posterior marginal (MPM) estimate. In this method, we tried the MPM estimate using two kinds of likelihood, both of which enhance grayscale images converted into JPEG-compressed images through lossy JPEG image compression. One is a deterministic model of the likelihood and the other is a probabilistic one expressed by the Gaussian distribution. Then, using Monte Carlo simulation for grayscale images, such as the 256-grayscale standard image "Lena" with 256 × 256 pixels, we examined the performance of the MPM estimate based on a performance measure using the mean square error. We clarified that the MPM estimate via the Gaussian probabilistic model of the likelihood is effective for reducing noise, such as blocking artifacts and mosquito noise, if the parameters are set appropriately. On the other hand, we found that the MPM estimate via the deterministic model of the likelihood is not effective for noise reduction due to the low acceptance ratio of the Metropolis algorithm.

Keywords: Noise reduction, JPEG-compressed image, Bayesian inference, the maximizer of the posterior marginal estimate

32 Structural Reliability of Existing Structures: A Case Study

Authors: Z. Sakka, I. Assakkaf, T. Al-Yaqoub, J. Parol

Abstract:

A reliability-based methodology for the assessment and evaluation of reinforced concrete (R/C) structural elements of concrete structures is presented herein. The results of the reliability analysis and assessment for R/C structural elements were verified by the results obtained through deterministic methods. The outcomes of the reliability-based analysis were compared against currently adopted safety limits incorporated in the reliability indices β, according to international standards and codes. The methodology is based on probabilistic analysis using reliability concepts and statistics of the main random variables that are relevant to the subject matter and that are used in the performance-function equation(s) associated with the structural elements under study. These techniques yield the reliability index β, a reliability measure that can be utilized to assess and evaluate the safety, human risk, and functionality of the structural component. They can also yield revised partial safety factor values for certain target reliability indices, which can be used for redesigning the R/C elements of the building and can assist in considering other remedial actions to improve the safety and functionality of the member.
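
As a small illustration of how a reliability index can be backed out of a Monte Carlo failure probability, consider an assumed performance function g = R - S (resistance minus load effect) with notional distributions:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
n = 2_000_000
R = rng.lognormal(np.log(250.0), 0.12, n)   # member resistance (kN.m), assumed
S = rng.normal(140.0, 25.0, n)              # load effect (kN.m), assumed
pf = np.mean(R - S < 0)                     # probability of failure, g < 0
beta = -norm.ppf(pf)                        # reliability index implied by P_f
print(f"P_f ≈ {pf:.2e}, beta ≈ {beta:.2f}")
```

FORM would instead linearize g at the design point; a Monte Carlo estimate of this kind serves as its cross-check.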

Keywords: Concrete Structures, FORM, Monte Carlo Simulation, Structural Reliability.

31 Capability Prediction of Machining Processes Based on Uncertainty Analysis

Authors: Hamed Afrasiab, Saeed Khodaygan

Abstract:

Prediction of machining process capability in the design stage plays a key role in reaching the precision design and manufacturing of mechanical products. Inaccuracies in the machining process lead to errors in the position and orientation of machined features on the part, and strongly affect the process capability in the final quality of the product. In this paper, an efficient systematic approach is given to investigate the machining errors, predict the manufacturing errors of the parts and predict the capability of the corresponding machining processes. A mathematical formulation of fixture locator modeling is presented to establish the relationship between the part errors and their related sources. Based on this method, the final machining errors of the part can be accurately estimated by relating them to the combined dimensional and geometric tolerances of the workpiece-fixture system. This method is developed for uncertainty analysis based on the worst case and statistical approaches. The application of the presented method is illustrated through an example and the computational results are compared with Monte Carlo simulation results.
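
A toy comparison of the two uncertainty treatments named above (worst case versus statistical), for an assumed linear stack-up of locator deviations into a machined-feature position error:

```python
import numpy as np

rng = np.random.default_rng(7)
tols = np.array([0.02, 0.03, 0.01, 0.05])     # mm, assumed locator tolerances
sens = np.array([1.0, -0.8, 0.5, 1.2])        # assumed sensitivities d(error)/dx_i

# worst case: every deviation sits at its tolerance limit simultaneously
worst_case = np.sum(np.abs(sens) * tols)

# statistical (Monte Carlo): deviations drawn within their tolerance bands
n = 500_000
x = rng.uniform(-tols, tols, (n, len(tols)))
err = x @ sens
print(f"worst case: +/-{worst_case:.3f} mm")
print(f"99.73% Monte Carlo range: +/-{np.quantile(np.abs(err), 0.9973):.3f} mm")
```

The statistical range is narrower because simultaneous extreme deviations are improbable, which is exactly why the two approaches are contrasted.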

Keywords: Process capability, machining error, dimensional and geometrical tolerances, uncertainty analysis.

30 Dynamic Correlations and Portfolio Optimization between Islamic and Conventional Equity Indexes: A Vine Copula-Based Approach

Authors: Imen Dhaou

Abstract:

This study examines conditional Value at Risk by applying the GJR-EVT-Copula model, and finds the optimal portfolio for eight Dow Jones Islamic-conventional pairs. Our methodology consists of modeling the data by a bivariate GJR-GARCH model from which we extract the filtered residuals and then apply the Peak over Threshold model (POT) to fit the residual tails in order to model the marginal distributions. After that, we use pair-copulas to find the optimal portfolio risk dependence structure. Finally, with Monte Carlo simulations, we estimate the Value at Risk (VaR) and the conditional Value at Risk (CVaR). The empirical results show the VaR and CVaR values for an equally weighted portfolio of Dow Jones Islamic-conventional pairs. In sum, we found that the optimal investment focuses on the Islamic-conventional US market index pair because of its high investment proportion, whereas all other index pairs have low investment proportions. These results have real implications for portfolio managers and policymakers concerning optimal asset allocation, portfolio risk management and the diversification advantages of these markets.
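
A bare-bones illustration of the final Monte Carlo step (VaR and CVaR read off simulated portfolio returns) is given below; the GJR-GARCH-EVT-copula machinery that would generate the scenarios is not reproduced, so plain Gaussian draws stand in for it.

```python
import numpy as np

rng = np.random.default_rng(8)
n, alpha = 100_000, 0.95
# stand-in simulated returns of an equally weighted two-index portfolio
r_islamic = rng.normal(0.0003, 0.011, n)
r_conv = rng.normal(0.0004, 0.013, n)
port = 0.5 * r_islamic + 0.5 * r_conv

var = -np.quantile(port, 1 - alpha)            # 95% Value at Risk (loss as positive number)
cvar = -port[port <= -var].mean()              # expected loss beyond the VaR
print(f"VaR(95%) = {var:.4f}, CVaR(95%) = {cvar:.4f}")
```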

Keywords: CVaR, Dow Jones Islamic index, GJR-GARCH-EVT-pair copula, portfolio optimization.

29 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model

Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis

Abstract:

In this paper, we propose a method to model the relationship between failure time and degradation for a simple step stress test where underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value. No assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten failure time of products and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is only known to belong to a certain subset of all possible failures. This case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of parameters is studied through a Monte-Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.
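
A toy EM iteration for two competing exponential risks with masked causes is sketched below; the paper's tampered failure rate and degradation-path structure is richer, and this only illustrates how masking is handled as missing data in the E-step.

```python
import numpy as np

rng = np.random.default_rng(9)
lam_true = np.array([0.02, 0.05])                     # assumed true cause intensities
n = 2_000
t = rng.exponential(1 / lam_true.sum(), n)            # failure times (total rate)
cause = rng.choice(2, n, p=lam_true / lam_true.sum()) # latent cause of each failure
masked = rng.random(n) < 0.4                          # 40% of causes are masked

lam = np.array([0.01, 0.01])                          # initial guess
for _ in range(200):
    # E-step: expected cause indicators; for masked items the posterior
    # probability of cause j is lam_j / sum(lam) under exponential risks
    w = np.zeros((n, 2))
    w[~masked, cause[~masked]] = 1.0
    w[masked] = lam / lam.sum()
    # M-step: rate_j = expected number of cause-j failures / total exposure
    lam = w.sum(axis=0) / t.sum()
print("estimated rates:", lam, " true rates:", lam_true)
```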

Keywords: Expectation-maximization (EM) algorithm, cause of failure, intensity, linear degradation path, masked data, reliability function.

28 Uncertainty Propagation and Sensitivity Analysis During Calibration of an Integrated Land Use and Transport Model

Authors: Parikshit Dutta, Mathieu Saujot, Elise Arnaud, Benoit Lefevre, Emmanuel Prados

Abstract:

In this work, the propagation of uncertainty during the calibration process of TRANUS, an integrated land use and transport model (ILUTM), has been investigated. It has also been examined, through a sensitivity analysis, which input parameters affect the variation of the outputs the most. Moreover, a probabilistic verification methodology for the calibration process, which equates the observed and calculated production, has been proposed. The model chosen as an application is the model of the city of Grenoble, France. For sensitivity analysis and uncertainty propagation, the Monte Carlo method was employed, and a statistical hypothesis test was used for verification. The parameters of the induced demand function in TRANUS were assumed to be uncertain in the present case. It was found that, if TRANUS converges during calibration, then with a high probability the calibration process is verified. Moreover, a weak correlation was found between the inputs and the outputs of the calibration process. The total effect of the inputs on outputs was investigated, and the output variation was found to be dictated by only a few input parameters.
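
Schematically, the sampling loop looks as follows, with a stand-in analytic model in place of TRANUS (whose runs are external) and assumed uniform input uncertainties; inputs are then ranked by their correlation with the output, mirroring the finding that only a few parameters dominate.

```python
import numpy as np

rng = np.random.default_rng(10)

def model(params):
    # placeholder for one TRANUS calibration run returning calculated production;
    # parameter c intentionally has no effect, so its correlation should be ~0
    a, b, c = params
    return 100 * a + 5 * b ** 2 + rng.normal(0, 1)

n = 5_000
samples = rng.uniform([0.5, 0.5, 0.5], [1.5, 1.5, 1.5], (n, 3))  # uncertain inputs
outputs = np.array([model(p) for p in samples])

for i, name in enumerate(["a", "b", "c"]):
    r = np.corrcoef(samples[:, i], outputs)[0, 1]
    print(f"input {name}: correlation with output = {r:+.3f}")
```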

Keywords: Uncertainty propagation, sensitivity analysis, calibration under uncertainty, hypothesis testing, integrated land use and transport models, TRANUS, Grenoble.

27 Using Simulation Modeling Approach to Predict USMLE Steps 1 and 2 Performances

Authors: Chau-Kuang Chen, John Hughes, Jr., A. Dexter Samuels

Abstract:

The prediction models for the United States Medical Licensure Examination (USMLE) Steps 1 and 2 performances were constructed by the Monte Carlo simulation modeling approach via linear regression. The purpose of this study was to build robust simulation models to accurately identify the most important predictors and yield valid range estimations of the Steps 1 and 2 scores. The application of the simulation modeling approach was deemed an effective way of predicting student performances on licensure examinations. Also, sensitivity analysis (a/k/a what-if analysis) in the simulation models was used to predict the magnitudes of the Steps 1 and 2 scores affected by changes in the National Board of Medical Examiners (NBME) Basic Science Subject Board scores. In addition, the study results indicated that the Medical College Admission Test (MCAT) Verbal Reasoning score and the Step 1 score were significant predictors of the Step 2 performance. Hence, institutions could screen qualified student applicants for interviews and document the effectiveness of their basic science education programs based on the simulation results.
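
A sketch of Monte Carlo simulation via linear regression: sample the predictors, push them through an assumed fitted model with residual noise, and read off a range estimate and a what-if shift. The coefficients and distributions below are placeholders, not the study's fit.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50_000
# assumed predictor distributions (e.g., three NBME subject-board scores)
nbme = rng.normal(75, 8, (n, 3)).clip(40, 100)
beta0, betas, sigma = 40.0, np.array([0.9, 0.6, 0.5]), 9.0   # assumed regression fit
step1 = beta0 + nbme @ betas + rng.normal(0, sigma, n)

lo, hi = np.percentile(step1, [2.5, 97.5])
print(f"predicted Step 1 range (95%): {lo:.0f} to {hi:.0f}")
# what-if (sensitivity) analysis: raise the first subject score by 5 points
print(f"mean Step 1 shift from +5 on subject 1: {betas[0] * 5:.1f}")
```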

Keywords: Prediction Model, Sensitivity Analysis, Simulation Method, USMLE.

26 Estimating the Production Potential of Wind Turbine Types Connected to the Network Using Random Number Simulation

Authors: Saeid Nahi, Seyed Mohammad Hossein Nabavi

Abstract:

Wind energy generation has become very important in today's power systems. The electrical energy produced by wind turbines at a site depends on several factors, such as the wind speed profile of the site and, in particular, the turbine's cut-in, rated and cut-out wind speeds. On the other hand, many different types of turbines are available on the market. Therefore, selecting a turbine whose capacity meets the needs of electricity consumers with high efficiency is important and necessary. In this context, calculating the amount of wind power is very important for optimizing the overall network and system operation and for determining the parameters of wind power. In this article, the Monte Carlo method is used to calculate the output of a wind power plant connected to the national network at the Manjil wind site, to select the best type of turbine, and to determine a power delivery profile appropriate to the network. Minute-by-minute wind speed data recorded over one year at the Manjil site are used. The necessary simulations, based on the random numbers simulation method with repetition, were carried out using MATLAB and Excel.
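
A minimal sketch of the turbine-selection calculation: draw wind speeds from an assumed Weibull model (standing in for the measured Manjil data), push them through each candidate turbine's power curve (cut-in, rated and cut-out speeds), and compare expected outputs. The turbine figures are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(12)

def power(v, v_in, v_rated, v_out, p_rated):
    """Piecewise power curve with a cubic ramp between cut-in and rated speed."""
    p = np.where((v >= v_in) & (v < v_rated),
                 p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3), 0.0)
    p = np.where((v >= v_rated) & (v < v_out), p_rated, p)
    return p

v = rng.weibull(2.0, 1_000_000) * 9.0        # assumed Weibull wind speed model (m/s)
turbines = {"A": (3.5, 13.0, 25.0, 660.0),   # hypothetical catalogue:
            "B": (4.0, 15.0, 25.0, 850.0)}   # (v_in, v_rated, v_out, p_rated in kW)
for name, spec in turbines.items():
    print(name, f"expected output ≈ {power(v, *spec).mean():.0f} kW")
```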

Keywords: wind turbine, efficiency, wind turbine work points, Random Numbers, reliability

25 Statically Fused Unbiased Converted Measurements Kalman Filter

Authors: Zhengkun Guo, Yanbin Li, Wenqing Wang, Bo Zou

Abstract:

Active radar and sonar systems often report Doppler measurements in addition to position measurements such as range and bearing. The tracker can perform better by making full use of the Doppler measurements. However, due to the high nonlinearity of the Doppler measurements with respect to the target state in Cartesian coordinate systems, those measurements are not always fully exploited. This paper mainly focuses on dealing with the Doppler measurements as well as the position measurements in polar coordinates. The Statically Fused Converted Position and Doppler Measurements Kalman Filter (SF-CMKF) with additive debiased measurement conversion has been presented previously. However, the exact compensation for the bias of the measurement conversion is multiplicative and depends on the statistics of the cosine of the angle measurement errors. As a result, the consistency and performance of the SF-CMKF may be suboptimal in large angle error situations. In this paper, the multiplicative unbiased position and Doppler measurement conversions for two-dimensional (polar-to-Cartesian) tracking are derived, and the SF-CMKF is improved by using those conversions. Monte Carlo simulations are presented to demonstrate the statistical consistency of the multiplicative unbiased conversion and the superior performance of the modified SF-CMKF (SF-UCMKF).
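
For the position part, the standard multiplicative unbiased conversion divides by the bias factor E[cos(angle error)] = exp(-sigma^2/2); a sketch with a Monte Carlo check follows (the paper's extension to the Doppler measurement and the converted covariances is omitted).

```python
import numpy as np

def unbiased_polar_to_cartesian(r, theta, sigma_theta):
    """Divide by the bias factor lambda = E[cos(angle error)] = exp(-sigma^2/2)."""
    lam = np.exp(-sigma_theta**2 / 2.0)
    return r * np.cos(theta) / lam, r * np.sin(theta) / lam

# Monte Carlo check of (un)biasedness for a target at (1000, 0)
rng = np.random.default_rng(13)
n, sigma_t = 1_000_000, np.deg2rad(5.0)
theta_meas = rng.normal(0.0, sigma_t, n)     # noisy bearing measurements
r_meas = rng.normal(1000.0, 10.0, n)         # noisy range measurements
x_naive = (r_meas * np.cos(theta_meas)).mean()
x_unb, _ = unbiased_polar_to_cartesian(r_meas, theta_meas, sigma_t)
print(f"naive mean x: {x_naive:.1f}, unbiased mean x: {np.mean(x_unb):.1f}")
```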

Keywords: Measurement conversion, Doppler, Kalman filter, estimation, tracking.

24 Reliability Based Performance Evaluation of Stone Column Improved Soft Ground

Authors: A. GuhaRay, C. V. S. P. Kiranmayi, S. Rudraraju

Abstract:

The present study considers the effect of variation of different geotechnical random variables in the design of stone column-foundation systems for assessing the bearing capacity and consolidation settlement of highly compressible soil. The soil and stone column properties, spacing, diameter and arrangement of stone columns are considered as the random variables. Probability of failure (Pf) is computed for a target degree of consolidation and a target safe load by Monte Carlo Simulation (MCS). The study shows that the variation in coefficient of radial consolidation (cr) and cohesion of soil (cs) are two most important factors influencing Pf. If the coefficient of variation (COV) of cr exceeds 20%, Pf exceeds 0.001, which is unsafe following the guidelines of US Army Corps of Engineers. The bearing capacity also exceeds its safe value for COV of cs > 30%. It is also observed that as the spacing between the stone column increases, the probability of reaching a target degree of consolidation decreases. Accordingly, design guidelines, considering both consolidation and bearing capacity of improved ground, are proposed for different spacing and diameter of stone columns and geotechnical random variables.

Keywords: Bearing capacity, consolidation, geotechnical random variables, probability of failure, stone columns.

23 Effect of Size of the Step in the Response Surface Methodology using Nonlinear Test Functions

Authors: Jesús Everardo Olguín Tiznado, Rafael García Martínez, Claudia Camargo Wilson, Juan Andrés López Barreras, Everardo Inzunza González, Javier Ordorica Villalvazo

Abstract:

The response surface methodology (RSM) is a collection of mathematical and statistical techniques useful in the modeling and analysis of problems in which the dependent variable is influenced by several independent variables, the goal being to determine the conditions under which these variables should operate to optimize a production process. RSM estimates a first-order regression model and sets the search direction using the method of maximum/minimum slope up/down (MMS U/D). However, this method selects the step size intuitively, which can affect the efficiency of the RSM. This paper assesses how the step size affects the efficiency of this methodology. The numerical examples are carried out through Monte Carlo experiments, evaluating three response variables: the efficiency of the gain function, the distance to the optimum and the number of iterations. The simulation experiments showed that the gain-function efficiency and the distance to the optimum were not affected by the step size, while the number of iterations was affected by both the step size and the type of test function used.
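
A compact illustration of how the step size enters the method of maximum slope: fit a first-order model around the current point from a small factorial design, move along the fitted gradient in increments of `step`, and count iterations. The test function and all constants are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(14)

def response(x):
    # noisy test function (assumed), maximum at (3, 5)
    return -(x[0] - 3) ** 2 - (x[1] - 5) ** 2 + rng.normal(0, 0.1)

def first_order_fit(center, h=0.25):
    """2^2 factorial design around `center`; returns the fitted slope vector."""
    X = center + h * np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]])
    y = np.array([response(x) for x in X])
    coefs, *_ = np.linalg.lstsq(np.column_stack([np.ones(4), X - center]), y, rcond=None)
    return coefs[1:]

for step in (0.1, 0.5, 1.0):
    x, iterations = np.zeros(2), 0
    for _ in range(200):
        g = first_order_fit(x)
        if np.linalg.norm(g) < 0.3:           # slope lost in the noise: near optimum
            break
        x = x + step * g / np.linalg.norm(g)  # move `step` along the steepest ascent
        iterations += 1
    print(f"step={step}: stopped at {np.round(x, 2)} after {iterations} iterations")
```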

Keywords: RSM, dependent variable, independent variables, efficiency, simulation

22 Effects of Livestream Affordances on Consumer Purchase Willingness: Explicit IT Affordances Perspective

Authors: Isaac O. Asante, Yushi Jiang, Hailin Tao

Abstract:

Livestreaming marketing, the new electronic commerce element, has become an optional marketing channel following the COVID-19 pandemic, and many sellers are leveraging the features presented by livestreaming to increase sales. This study was conducted to measure real-time observable interactions between consumers and sellers. Based on affordance theory, this study conceptualized constructs representing the interactive features and examined how they drive consumers’ purchase willingness during livestreaming sessions, using 1,238 datasets from Amazon Live obtained through manual observation of transaction records. Within a structural equation modeling framework, the ordinary least squares regression suggests that live viewers, new followers, live chats, and likes positively affect purchase willingness. The Sobel and Monte Carlo tests show that new followers, live chats, and likes significantly mediate the relationship between live viewers and purchase willingness. The study presents a way of measuring interactions in livestreaming commerce and proposes a way to manually gather data on consumer behaviors on livestreaming platforms when the application programming interface (API) of such platforms does not support data mining algorithms.
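
A short sketch of the two mediation checks mentioned (the Sobel z-test and the Monte Carlo confidence interval for the indirect effect a*b); the path coefficients and standard errors are placeholders, not the study's estimates.

```python
import numpy as np
from scipy.stats import norm

a, se_a = 0.42, 0.08     # assumed path: live viewers -> mediator (e.g., live chats)
b, se_b = 0.31, 0.06     # assumed path: mediator -> purchase willingness

# Sobel test for the indirect effect a*b
se_ab = np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
z = (a * b) / se_ab
print(f"Sobel z = {z:.2f}, p = {2 * (1 - norm.cdf(abs(z))):.4f}")

# Monte Carlo test: simulate a and b, inspect the distribution of a*b
rng = np.random.default_rng(15)
ab = rng.normal(a, se_a, 100_000) * rng.normal(b, se_b, 100_000)
lo, hi = np.percentile(ab, [2.5, 97.5])
print(f"95% Monte Carlo CI for a*b: [{lo:.3f}, {hi:.3f}]")
```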

Keywords: Livestreaming marketing, live chats, live viewers, likes, new followers, purchase willingness.

21 Effect of Tube Thickness on the Face Bending for Blind-Bolted Connection to Concrete Filled Tubular Structures

Authors: Mohammed Mahmood, Walid Tizani, Carlo Sansour

Abstract:

In this paper, experimental testing and numerical analysis were used to investigate the effect of tube thickness on the face bending of concrete filled hollow sections connected to other structural members using Extended Hollobolts. Six samples were tested experimentally by applying pull-out load on the bolts. These samples were designed to fail by column face bending. The main variable in all tests is the column face thickness. Finite element analyses were also performed using ABAQUS 6.11 to extend the experimental results and to quantify the effect of column face thickness. Results show that the column face thickness has a clear impact on the connection strength and stiffness. However, the improvement in the connection stiffness when changing the column face thickness from 5 mm to 6.3 mm appears to be greater than that when increasing it from 6.3 mm to 8 mm. The displacement at which the bolts start pulling out from their holes increased with thinner column faces due to the high flexibility of the section. At the ultimate strength, the yielding of the column face propagated to the column corner and there was no yielding in its walls. After the ultimate resistance was reached, the yielding propagated mainly in the column face with minor yielding in the walls.

Keywords: Anchored bolted connection, Extended Hollobolt, Column face bending, Concrete filled hollow sections.

20 Diagnosing the Cause and its Timing of Changes in Multivariate Process Mean Vector from Quality Control Charts using Artificial Neural Network

Authors: Farzaneh Ahmadzadeh

Abstract:

Quality control charts are very effective in detecting out-of-control signals, but when a control chart signals an out-of-control condition of the process mean, searching for a special cause in the vicinity of the signal time does not always lead to prompt identification of the source(s) of the out-of-control condition, as the change point in the process parameter(s) is usually different from the signal time. It is very important to the manufacturer to determine at what point and which parameters in the past caused the signal. Early warning of process change would expedite the search for the special causes and enhance quality at lower cost. In this paper, the quality variables under investigation are assumed to follow a multivariate normal distribution with known means and variance-covariance matrix, and the process means after a one-step change remain at the new level until the special cause is identified and removed; it is also supposed that only one variable can change at a time. This research applies an artificial neural network (ANN) to identify the time the change occurred and the parameter which caused the change or shift. The performance of the approach was assessed through a computer simulation experiment. The results show that the neural network performs effectively and equally well for the whole range of shift magnitudes considered.

Keywords: Artificial neural network, change point estimation, Monte Carlo simulation, multivariate exponentially weighted moving average

19 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Surrogate models have received increasing attention for use in detecting damage of structures based on vibration modal parameters. However, uncertainties existing in the measured vibration data may lead to false or unreliable output results from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take into account the effect of uncertainties in developing a surrogate model. The probability of damage existence (PDE) is calculated based on the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error and is therefore chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show the proposed method can perform well in probability-based damage detection of structures with less computational effort compared to the direct finite element model.

Keywords: Enhanced ideal gas molecular movement, Kriging, probability-based damage detection, probability of damage existence, surrogate modeling, uncertainty quantification.

18 A Methodology to Virtualize Technical Engineering Laboratories: MastrLAB-VR

Authors: Ivana Scidà, Francesco Alotto, Anna Osello

Abstract:

Due to the importance given today to innovation, the education sector is evolving thanks to digital technologies. Virtual Reality (VR) can be a potential teaching tool, offering many advantages in the field of training and education, as it allows users to acquire theoretical knowledge and practical skills through an immersive experience in less time than the traditional educational process. These assumptions lay the foundations for a new educational environment, engaging and stimulating for students. Starting from the objective of strengthening the innovative teaching offer and the learning processes, the case study of the research concerns the digitalization of MastrLAB, a High Quality Laboratory (HQL) belonging to the Department of Structural, Building and Geotechnical Engineering (DISEG) of the Polytechnic of Turin, a center specialized in experimental mechanical tests on traditional and innovative building materials and on the structures made with them. MastrLAB-VR, an innovative training tool, has been developed with the aim of educating the class in total safety on the techniques of use of machinery, thus reducing the dangers arising from the performance of potentially dangerous activities. The virtual laboratory, dedicated to the students of the Building and Civil Engineering Courses of the Polytechnic of Turin, has been designed to simulate, in a highly realistic way, the experimental approach to the structural tests foreseen in their courses of study: from tensile tests to relaxation tests, from steel qualification tests to resilience tests on elements at ambient conditions or at characterizing temperatures. The research work proposes a methodology for the virtualization of technical laboratories through the application of Building Information Modelling (BIM), starting from the creation of a digital model. The process includes the creation of a standalone application which, with Oculus Rift technology, allows the user to explore the environment and interact with objects through the use of joypads. The application has been tested as a prototype on volunteers, obtaining results related to the acquisition of the educational notions conveyed by the experience through a virtual quiz with multiple answers, achieving an overall evaluation report. The results have shown that MastrLAB-VR is suitable for both beginners and experts and will be adopted experimentally for other laboratories of the University departments.

Keywords: Building Information Modelling, digital learning, education, virtual laboratory, virtual reality.

17 Suppression of Narrowband Interference in Impulse Radio Based High Data Rate UWB WPAN Communication System Using NLOS Channel Model

Authors: Bikramaditya Das, Susmita Das

Abstract:

The suppression of interference with time-domain equalizers is studied for a high data rate impulse radio (IR) ultra wideband communication system. Narrowband systems may interfere with UWB devices, which have very low transmission power and large bandwidth. The SRake receiver improves system performance by equalizing signals from different paths, which enables the use of SRake receiver techniques in IR-UWB systems. But the Rake receiver alone fails to suppress narrowband interference (NBI). A hybrid SRake-MMSE time-domain equalizer is proposed to overcome this, taking into account both the number of Rake fingers and the number of equalizer taps; it also combats intersymbol interference. A semi-analytical approach and Monte Carlo simulation are used to investigate the BER performance of the SRake-MMSE receiver on IEEE 802.15.3a UWB channel models. The study of non-line-of-sight indoor channel models (both CM3 and CM4) shows that the bit error rate performance of the SRake-MMSE receiver with NBI is better than that of the Rake receiver without NBI. We show that for an MMSE equalizer operating at high SNRs, the number of equalizer taps plays the more significant role in suppressing interference.

Keywords: IR-UWB, UWB, IEEE 802.15.3a, NBI, data rate, bit error rate.

16 A Case Study on the Numerical-Probability Approach for Deep Excavation Analysis

Authors: Komeil Valipourian

Abstract:

Urban advances and the growing need for developing infrastructure have increased the importance of deep excavations. In this study, after introducing probability analysis as an important issue, an attempt has been made to apply it to the deep excavation project of Bangkok’s Metro as a case study. For this, a numerical probability model has been developed based on the finite difference method and a Monte Carlo sampling approach. The results indicate that disregarding probability in this project would result in an inappropriate design of the retaining structure. Therefore, a probabilistic redesign of the support is proposed and carried out as one of the applications of probability analysis. A 50% reduction in the flexural strength of the structure increases the failure probability by just 8%, within the allowable range, and helps improve economic conditions while maintaining mechanical efficiency. With regard to the lack of efficient design in most deep excavations, by considering geometrical and geotechnical variability, an attempt was made to develop an optimum practical design standard for deep excavations based on failure probability. On this basis, a practical relationship is presented for estimating the maximum allowable horizontal displacement, which can help improve design conditions without carrying out the full probability analysis.

Keywords: Numerical probability modeling, deep excavation, allowable maximum displacement, finite difference method, FDM.
