Search results for: λ-levelwise statistical convergence


4148 Sensor Registration in Multi-Static Sonar Fusion Detection

Authors: Longxiang Guo, Haoyan Hao, Xueli Sheng, Hanjun Yu, Jingwei Yin

Abstract:

In order to prevent target splitting and ensure the accuracy of fusion, system error registration is an important step in a multi-static sonar fusion detection system. To eliminate the inherent system errors of each sonar, including distance error and angle error, this paper uses an offline estimation method for error registration. Suppose several sonars from different platforms work together to detect a target; the target position detected by each sonar is expressed in that sonar’s own reference coordinate system. Based on the two-dimensional stereo projection method, this paper uses the real-time quality control (RTQC) method and the least squares (LS) method to estimate sensor biases. The RTQC method takes the average value of each sonar’s data as the observation value, while the LS method applies least-squares processing to each sonar’s data to obtain the observation value. MATLAB simulations carried out in an underwater acoustic environment show that both algorithms can estimate the distance and angle errors of the sonar system. The performance of the two algorithms is also compared through the root mean square error, and the influence of measurement noise on registration accuracy is explored by simulation. The system error of the RTQC method converges rapidly, but the distribution of targets has a serious impact on its performance. The LS method is not affected by the target distribution, but increasing random noise slows down its convergence rate. The LS method is an improvement on the RTQC method and is widely used in two-dimensional registration. The improved method can be used for underwater multi-target detection registration.
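
As a rough illustration of the least-squares registration idea, the Python sketch below recovers constant range and bearing biases from simulated measurements against a reference track. The offset-only bias model, the noise levels, and the data are assumptions made for illustration; they do not reproduce the paper's two-dimensional stereo projection formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simplified sketch: one sonar observes a target track with a constant
# range bias b_r and bearing bias b_th plus measurement noise.
n = 200
true_r = rng.uniform(1000.0, 5000.0, n)          # reference ranges (m)
true_th = rng.uniform(-np.pi, np.pi, n)          # reference bearings (rad)
b_r, b_th = 35.0, np.deg2rad(1.5)                # unknown biases to recover
meas_r = true_r + b_r + rng.normal(0.0, 10.0, n)
meas_th = true_th + b_th + rng.normal(0.0, np.deg2rad(0.3), n)

# LS registration: stack residuals and solve for the constant offsets.
# With a pure offset model the design matrix is just a column of ones.
A = np.ones((n, 1))
b_r_hat, *_ = np.linalg.lstsq(A, meas_r - true_r, rcond=None)
b_th_hat, *_ = np.linalg.lstsq(A, meas_th - true_th, rcond=None)

# RTQC-style estimate: averaging the residuals (identical here, since LS
# with an offset-only model reduces to the mean).
print("LS   range bias  :", float(b_r_hat[0]), "m")
print("LS   bearing bias:", np.rad2deg(float(b_th_hat[0])), "deg")
print("Mean range bias  :", np.mean(meas_r - true_r), "m")
```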

Keywords: data fusion, multi-static sonar detection, offline estimation, sensor registration problem

Procedia PDF Downloads 136
4147 Statistical Shape Analysis of the Human Upper Airway

Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar

Abstract:

The main objective of this project is to develop a statistical shape model, using principal component analysis (PCA), that can be used to analyze the shape of the human airway. The ultimate goal of the project is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymous CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed in order to perform PCA on the segmented airway and to find the relationship between airway shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant. Mode 1 was interpreted as the overall length of the airway, Mode 2 was related to the anterior-posterior width of the retroglossal region, Mode 3 to the lateral dimension of the oropharyngeal region, and Mode 4 to the anterior-posterior width of the oropharyngeal region. All of these regions are associated with the risk factors of OSA.
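
A minimal sketch of the PCA step, assuming the landmark data have already been aligned and flattened to one row per subject; the array sizes and the random stand-in data are illustrative only.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Hypothetical landmark data: 25 airways, each described by 300 pseudo-landmarks
# in 3D, flattened to one row per subject after alignment/registration.
n_subjects, n_landmarks = 25, 300
X = rng.normal(size=(n_subjects, n_landmarks * 3))

pca = PCA(n_components=4)          # first four modes of variation
scores = pca.fit_transform(X)      # per-subject mode scores

print("explained variance ratio:", pca.explained_variance_ratio_)

# A shape along mode k is reconstructed as mean + c * sqrt(eigenvalue_k) * mode_k,
# which is how each mode is interpreted anatomically (e.g. airway length, widths).
mean_shape = pca.mean_
mode_1 = pca.components_[0]
plus_2sd = mean_shape + 2.0 * np.sqrt(pca.explained_variance_[0]) * mode_1
print("mode-1 shape vector length:", plus_2sd.size)
```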

Keywords: medical imaging, image processing, FEM/BEM, statistical modelling

Procedia PDF Downloads 479
4146 Hypersonic Flow of CO2-N2 Mixture around a Spacecraft during the Atmospheric Reentry

Authors: Zineddine Bouyahiaoui, Rabah Haoui

Abstract:

The aim of this work is to analyze the flow around an axisymmetric blunt body, taking into account chemical and vibrational nonequilibrium. This work concerns the entry of a spacecraft into the atmosphere of the planet Mars. Since the governing equations are non-linear partial differential equations, a finite volume method is used to solve the problem numerically. The choice of the mesh and of the CFL number conditions the convergence to the stationary solution.

Keywords: blunt body, finite volume, hypersonic flow, viscous flow

Procedia PDF Downloads 208
4145 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of an optimization, and this optimization is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model is examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos has been to find a model that explains chaotic behaviour; the present model may reveal further insight into quantum chaos.
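
The following sketch shows a scalar Kalman filter applied to a classical AR(1) state-space model, the kind of parameter-estimation building block the abstract mentions; the model, parameter values, and simulated data are assumptions for illustration and do not represent the authors' quantum time series formulation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Latent AR(1) state x_t = phi * x_{t-1} + w_t, observed as y_t = x_t + v_t.
# A scalar Kalman filter recovers the state; the filtered innovations can then
# be used to evaluate the likelihood of candidate parameters (phi, q, r).
phi, q, r, T = 0.8, 0.5, 1.0, 300
x = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = x[t] + rng.normal(0, np.sqrt(r))

x_hat, P = 0.0, 1.0
filtered, loglik = [], 0.0
for t in range(T):
    # predict
    x_pred = phi * x_hat
    P_pred = phi * P * phi + q
    # update
    S = P_pred + r                    # innovation variance
    K = P_pred / S                    # Kalman gain
    innov = y[t] - x_pred
    x_hat = x_pred + K * innov
    P = (1.0 - K) * P_pred
    filtered.append(x_hat)
    loglik += -0.5 * (np.log(2 * np.pi * S) + innov**2 / S)

print("log-likelihood of (phi, q, r):", loglik)
```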

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 431
4144 A Brief Study about Nonparametric Adherence Tests

Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim

Abstract:

Statistical study has become indispensable in various fields of knowledge. Geotechnics is no different: probabilistic and statistical methods have gained importance considering their use in characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that represents the sampled data significantly. To be able to discard poor distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set in order to test its fit to a series of known distributions. It is shown that the use of the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
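
A short Python illustration of the three adherence tests named in the keywords, applied to a synthetic sample tested against a fitted normal distribution (using SciPy); note that the reported p-values are only approximate when the distribution parameters are estimated from the same data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic "soil property" sample: actually lognormal, so a fitted normal
# distribution should be rejected by the adherence (goodness-of-fit) tests.
sample = rng.lognormal(mean=1.0, sigma=0.4, size=200)
mu, sigma = sample.mean(), sample.std(ddof=1)

# Kolmogorov-Smirnov against the fitted normal
ks = stats.kstest(sample, 'norm', args=(mu, sigma))

# Anderson-Darling (SciPy returns the statistic and tabulated critical values)
ad = stats.anderson(sample, dist='norm')

# Cramer-von Mises against the fitted normal (SciPy >= 1.6)
cvm = stats.cramervonmises(sample, 'norm', args=(mu, sigma))

print("KS  statistic %.3f  p-value %.4f" % (ks.statistic, ks.pvalue))
print("AD  statistic %.3f  critical values %s" % (ad.statistic, ad.critical_values))
print("CvM statistic %.3f  p-value %.4f" % (cvm.statistic, cvm.pvalue))
```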

Keywords: Kolmogorov-Smirnov test, Anderson-Darling test, Cramer-Von-Mises test, nonparametric adherence tests

Procedia PDF Downloads 414
4143 English Language Proficiency and Use as Determinants of Transactional Success in Gbagi Market, Ibadan, Nigeria

Authors: A. Robbin

Abstract:

Language selection can be an efficient negotiation strategy employed by both service or product providers and their customers to achieve transactional success. The transactional scenario in Gbagi Market, Ibadan, Nigeria provides an appropriate setting for the exploration of the Nigerian multilingual situation, with its own interesting linguistic peculiarities that question the functionality of the ‘lingua franca’ in trade situations. This study examined English language proficiency among Yoruba traders in Gbagi Market, Ibadan and its use as a determinant of transactional success during service encounters. Randomly selected Yoruba-English bilingual traders and customers were administered questionnaires, and the data were subjected to statistical and descriptive analysis using Giles’ Communication Accommodation Theory. Findings reveal that only fifty percent of the traders used for the study were proficient in spoken English. Traders with minimal proficiency in Standard English, however, resorted to the use of Nigerian Pidgin English. Both traders and customers select the mother tongue, which is the Yoruba language, during service encounters but are quick to converge to the other’s preferred language as the transactional exchange demands. The selection of English is not so much for the prestige or lingua franca status of the language as for its functions, which include ease of communication, negotiation, and increased sales. The use of English during service encounters is mostly determined by the customer’s linguistic preference, which the trader accommodates for better negotiation, and never as a first choice. This convergence is found to be beneficial as it ensures sales and return patronage. Although English is not the preferred code choice in Gbagi Market, it serves as a functional trade strategy for transactional success during service encounters in the market.

Keywords: communication accommodation theory, language selection, proficiency, service encounter, transaction

Procedia PDF Downloads 125
4142 Wavelet-Based Classification of Myocardial Ischemia, Arrhythmia, Congestive Heart Failure and Sleep Apnea

Authors: Santanu Chattopadhyay, Gautam Sarkar, Arabinda Das

Abstract:

This paper presents a wavelet-based classification of various heart diseases. Electrocardiogram signals of different heart patients have been studied, and the statistical nature of the electrocardiogram signals for different heart diseases has been compared with that of electrocardiograms for normal persons. Four heart diseases are considered in this study: Myocardial Ischemia (MI), Congestive Heart Failure (CHF), Arrhythmia, and Sleep Apnea. The statistical nature of the electrocardiograms in each case is characterized by the kurtosis values of two types of wavelet coefficients: approximation and detail. Nine wavelet decomposition levels are considered in each case, and the kurtosis of both the approximation and detail coefficients is computed from decomposition level one to level nine. Based on significant differences, a few decomposition levels are chosen and then used for classification.
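
A hedged sketch of the feature-extraction step, using PyWavelets for a nine-level decomposition and SciPy for the kurtosis of each coefficient band; the stand-in signal and the db4 wavelet are assumptions, as the paper does not state which mother wavelet was used.

```python
import numpy as np
import pywt                      # PyWavelets, assumed installed
from scipy.stats import kurtosis

rng = np.random.default_rng(4)

# Stand-in ECG segment (a real record, e.g. from a public database, would be
# used in practice).
fs, seconds = 360, 60
t = np.arange(fs * seconds) / fs
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.normal(size=t.size)

# Nine-level discrete wavelet decomposition; coeffs = [cA9, cD9, cD8, ..., cD1]
coeffs = pywt.wavedec(ecg, 'db4', level=9)

# Kurtosis of the approximation and of each detail level forms the feature vector.
features = {'cA9': kurtosis(coeffs[0])}
for lvl, d in zip(range(9, 0, -1), coeffs[1:]):
    features[f'cD{lvl}'] = kurtosis(d)

print(features)   # levels whose kurtosis differs most between classes are retained
```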

Keywords: arrhythmia, congestive heart failure, discrete wavelet transform, electrocardiogram, myocardial ischemia, sleep apnea

Procedia PDF Downloads 103
4141 Second Order Statistics of Dynamic Response of Structures Using Gamma Distributed Damping Parameters

Authors: Badreddine Chemali, Boualem Tiliouine

Abstract:

This article presents the main results of a numerical investigation of the uncertainty of the dynamic response of structures with statistically correlated, Gamma-distributed random damping. A computational method based on a Linear Statistical Model (LSM) is implemented to predict second order statistics of the response of a typical industrial building structure. The significance of random damping with correlated parameters and its implications for the sensitivity of the structural peak response in the neighborhood of a resonant frequency are discussed in light of considerable ranges of damping uncertainties and correlation coefficients. The results are compared to those generated using Monte Carlo simulation techniques. The numerical results obtained show the importance of damping uncertainty and of the statistical correlation of damping coefficients when obtaining accurate probabilistic estimates of the dynamic response of structures. Furthermore, the effectiveness of the LSM model in efficiently predicting uncertainty propagation for structural dynamic problems with correlated damping parameters is demonstrated.
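
As a simplified stand-in for the Monte Carlo reference solution, the sketch below samples two correlated, Gamma-distributed modal damping ratios through a Gaussian copula and propagates them to a crude peak-response measure; the structural model, distribution parameters, and correlation value are illustrative assumptions, not the LSM of the paper.

```python
import numpy as np
from scipy.stats import norm, gamma

rng = np.random.default_rng(5)
n_sim = 20000

# Two modal damping ratios, Gamma distributed (means ~3% and ~4%), correlated
# through a Gaussian copula with correlation rho (an illustrative assumption).
rho = 0.6
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n_sim, 2)) @ L.T
u = norm.cdf(z)
zeta1 = gamma.ppf(u[:, 0], a=4.0, scale=0.03 / 4.0)
zeta2 = gamma.ppf(u[:, 1], a=4.0, scale=0.04 / 4.0)

# Peak dynamic amplification of each mode near resonance is ~ 1 / (2*zeta);
# a crude two-mode SRSS combination stands in for the full structural model.
peak = np.sqrt((1.0 / (2.0 * zeta1))**2 + (1.0 / (2.0 * zeta2))**2)

print("mean peak amplification :", peak.mean())
print("std  peak amplification :", peak.std(ddof=1))
print("95th percentile         :", np.percentile(peak, 95))
```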

Keywords: correlated random damping, linear statistical model, Monte Carlo simulation, uncertainty of dynamic response

Procedia PDF Downloads 242
4140 Irrigation Water Quality Evaluation Based on Multivariate Statistical Analysis: A Case Study of Jiaokou Irrigation District

Authors: Panpan Xu, Qiying Zhang, Hui Qian

Abstract:

Groundwater is the main source of water supply in the Guanzhong Basin, China. To investigate the quality of groundwater for agricultural purposes in the Jiaokou Irrigation District, located in the east of the Guanzhong Basin, 141 groundwater samples were collected and analyzed for major ions (K+, Na+, Mg2+, Ca2+, SO42-, Cl-, HCO3-, and CO32-), pH, and total dissolved solids (TDS). Sodium percentage (Na%), residual sodium carbonate (RSC), magnesium hazard (MH), and potential salinity (PS) were applied for irrigation water quality assessment. In addition, multivariate statistical techniques were used to identify the underlying hydrogeochemical processes. Results show that the TDS content depends mainly on Cl-, Na+, Mg2+, and SO42-, and that the HCO3- content is generally high except in the eastern sand area. These patterns reflect complex hydrogeochemical processes, such as the dissolution of carbonate minerals (dolomite and calcite), gypsum, halite, and silicate minerals, cation exchange, and evaporation and concentration. The average evaluation levels of Na%, RSC, MH, and PS for irrigation water quality are doubtful, good, unsuitable, and injurious to unsatisfactory, respectively. Therefore, it is necessary for decision makers to consider the indicators comprehensively in order to evaluate the irrigation water quality reasonably.
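
The four irrigation indices can be computed from ionic concentrations as in the sketch below, using their standard definitions in meq/L; the sample values shown are illustrative, not data from the study.

```python
# Hypothetical groundwater sample, concentrations in mg/L.
conc = {'Na': 180.0, 'K': 6.0, 'Ca': 75.0, 'Mg': 40.0,
        'Cl': 210.0, 'SO4': 190.0, 'HCO3': 340.0, 'CO3': 0.0}

# Equivalent weights (g/eq) used to convert mg/L to meq/L.
eq_wt = {'Na': 23.0, 'K': 39.1, 'Ca': 20.04, 'Mg': 12.15,
         'Cl': 35.45, 'SO4': 48.03, 'HCO3': 61.02, 'CO3': 30.0}
meq = {ion: conc[ion] / eq_wt[ion] for ion in conc}

# Standard irrigation indices (definitions as commonly used in the literature):
na_pct = 100.0 * (meq['Na'] + meq['K']) / (meq['Na'] + meq['K'] + meq['Ca'] + meq['Mg'])
rsc    = (meq['HCO3'] + meq['CO3']) - (meq['Ca'] + meq['Mg'])
mh     = 100.0 * meq['Mg'] / (meq['Ca'] + meq['Mg'])
ps     = meq['Cl'] + 0.5 * meq['SO4']

print(f"Na% = {na_pct:.1f}  RSC = {rsc:.2f} meq/L  MH = {mh:.1f}  PS = {ps:.2f} meq/L")
```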

Keywords: irrigation water quality, multivariate statistical analysis, groundwater, hydrogeochemical process

Procedia PDF Downloads 116
4139 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation

Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski

Abstract:

Edgeworth approximation is one of the most important statistical methods and has made a considerable contribution to reducing the standard deviations of the independent variables’ coefficients in a quantile regression model, which estimates the conditional median or other quantiles. In this paper, we apply approximating statistical methods to an economic problem. We created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products, using real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is why we used the quantile regression model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, the profit and the realized sales, and to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods applied in our work are the Edgeworth approximation for the independent and identically distributed (IID) case, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and results presented here identify the best approximating model in our study.
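
A minimal sketch of a median (quantile) regression of profit on sales with a bootstrap of the slope, using statsmodels; the synthetic heteroscedastic data are an assumption, and the Edgeworth correction itself is not implemented here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)

# Hypothetical monthly data: realized sales (x) and profit (y) with
# heteroscedastic noise, which motivates a median (quantile) regression.
n = 120
sales = rng.uniform(5, 50, n)                       # thousands of units
profit = 1.5 * sales + rng.normal(0, 0.3 * sales)   # noise grows with sales

X = sm.add_constant(sales)
median_fit = sm.QuantReg(profit, X).fit(q=0.5)
print(median_fit.params)                            # intercept, slope at the median

# Bootstrap the slope to quantify (and try to reduce) its standard error.
n_boot = 1000
slopes = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, n, n)
    Xb = sm.add_constant(sales[idx])
    slopes[b] = sm.QuantReg(profit[idx], Xb).fit(q=0.5).params[1]

print("bootstrap SE of slope:", slopes.std(ddof=1))
print("95% bootstrap CI     :", np.percentile(slopes, [2.5, 97.5]))
```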

Keywords: bootstrap, edgeworth approximation, IID, quantile

Procedia PDF Downloads 127
4138 Introduction of Robust Multivariate Process Capability Indices

Authors: Behrooz Khalilloo, Hamid Shahriari, Emad Roghanian

Abstract:

Process capability indices (PCIs) are important concepts in statistical quality control; they measure the capability of processes and how well processes meet certain specifications. An important issue in statistical quality control is parameter estimation. Under the assumption of multivariate normality, the distribution parameters, the mean vector and the variance-covariance matrix, must be estimated when they are unknown. Classical estimation methods such as the method of moments (MME) or maximum likelihood estimation (MLE) give good estimates of the population parameters when the data are not contaminated, but when outliers exist in the data, MME and MLE are weak estimators of the population parameters. Estimators that perform well in the presence of outliers are therefore needed. In this work, robust M-estimators are used to estimate these parameters, and based on the robust parameter estimators, robust process capability indices are introduced. The performance of these robust estimators in the presence of outliers and their effect on the process capability indices are evaluated using real and simulated multivariate data. The results indicate that the proposed robust capability indices perform much better than the existing process capability indices.

Keywords: multivariate process capability indices, robust M-estimator, outlier, multivariate quality control, statistical quality control

Procedia PDF Downloads 250
4137 Strategic Investment in Infrastructure Development to Facilitate Economic Growth in the United States

Authors: Arkaprabha Bhattacharyya, Makarand Hastak

Abstract:

The COVID-19 pandemic is unprecedented in terms of its global reach and economic impacts. Historically, investment in infrastructure development projects has been touted as a way to boost the economic growth of a nation. The state and local governments responsible for delivering infrastructure assets work under tight budgets. Therefore, it is important to understand which infrastructure projects have the highest potential to boost economic growth in the post-pandemic era. This paper presents relationships between infrastructure projects and economic growth. Statistical relationships between investment in different types of infrastructure projects (transit, water and wastewater, highways, power, manufacturing, etc.) and indicators of economic growth are presented using historical data from 2002 to 2020 from the U.S. Census Bureau and the U.S. Bureau of Economic Analysis (BEA). The outcome of the paper is a comparison of the statistical correlations between investment in different types of infrastructure projects and indicators of economic growth. This comparison is useful for ranking the types of infrastructure projects based on their ability to influence economic prosperity; investment in the higher-ranked infrastructure types will therefore have a better chance of boosting economic growth. Once the ranks are derived, they can be used by decision-makers in the infrastructure investment decision-making process.

Keywords: economic growth, infrastructure development, infrastructure projects, strategic investment

Procedia PDF Downloads 142
4136 Statistical Assessment of Models for Determination of Soil–Water Characteristic Curves of Sand Soils

Authors: S. J. Matlan, M. Mukhlisin, M. R. Taha

Abstract:

Characterization of the engineering behavior of unsaturated soil depends on the soil-water characteristic curve (SWCC), a graphical representation of the relationship between water content or degree of saturation and soil suction. A reasonable description of the SWCC is thus important for the accurate prediction of unsaturated soil parameters. The measurement procedures for determining the SWCC, however, are difficult, expensive, and time-consuming. During the past few decades, researchers have focused on developing empirical equations for predicting the SWCC, and a large number of empirical models have been suggested. One of the most crucial questions is how precisely the existing equations can represent the SWCC. As different models have different ranges of capability, it is essential to evaluate the precision of the SWCC models used for each particular soil type for better SWCC estimation. It is expected that better estimation of the SWCC would be achieved via a thorough statistical analysis of its distribution within a particular soil class. With this in view, a statistical analysis was conducted in order to evaluate the reliability of the SWCC prediction models against laboratory measurements. Optimization techniques were used to obtain the best fit of the model parameters for four forms of the SWCC equation, using laboratory data for a relatively coarse-textured (i.e., sandy) soil. The four most prominent SWCC models were evaluated and computed for each sample. The results show that the Brooks and Corey model is the most consistent in describing the SWCC for the sandy soil type, and its predictions were compatible with the samples evaluated in this study across the range from low to high soil water content.
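
A short example of fitting the Brooks and Corey SWCC form to laboratory points with SciPy; the data values, the parameterization in terms of effective saturation, and the initial guesses are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def brooks_corey(psi, psi_b, lam):
    """Effective saturation Se as a function of suction psi (kPa)."""
    psi = np.asarray(psi, dtype=float)
    return np.where(psi <= psi_b, 1.0, (psi_b / psi) ** lam)

# Hypothetical laboratory SWCC points for a sandy soil (suction in kPa, Se in [0, 1]).
psi_lab = np.array([0.5, 1, 2, 4, 8, 16, 32, 64, 128])
se_lab  = np.array([1.00, 0.98, 0.90, 0.62, 0.40, 0.26, 0.17, 0.11, 0.07])

popt, pcov = curve_fit(brooks_corey, psi_lab, se_lab, p0=[2.0, 0.7])
psi_b_hat, lam_hat = popt
residuals = se_lab - brooks_corey(psi_lab, *popt)
rmse = np.sqrt(np.mean(residuals**2))

print(f"air-entry value psi_b = {psi_b_hat:.2f} kPa, pore-size index lambda = {lam_hat:.2f}")
print(f"RMSE of fit = {rmse:.4f}")
```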

Keywords: soil-water characteristic curve (SWCC), statistical analysis, unsaturated soil, geotechnical engineering

Procedia PDF Downloads 313
4135 R Statistical Software Applied in Reliability Analysis: Case Study of Diesel Generator Fans

Authors: Jelena Vucicevic

Abstract:

Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety, and monetary costs. There are established ways to calculate reliability, unreliability, failure density, and failure rate. This paper introduces another way of calculating reliability, using the R statistical software. R is a free software environment for statistical computing and graphics that compiles and runs on a wide variety of UNIX platforms, Windows, and macOS. The R programming environment is a widely used open source system for statistical analysis and statistical programming. It includes thousands of functions for the implementation of both standard and new statistical methods, and it does not limit the user to these functions, since the R language is easy to extend with user-written functions. R has many benefits over similar programs: it is free and, being open source, constantly updated, and it has a built-in help system. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that no technical details are needed, and it can be implemented for any part for which we need to know the time to failure, in order to plan appropriate maintenance but also to maximize usage and minimize costs. In this case, the calculations were made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with a higher quality fan to prevent future failures. Seventy generators were studied; for each one, the number of hours of running time from its first being put into service until fan failure or until the end of the study (whichever came first) was recorded. The dataset consists of two variables: hours and status. Hours gives the running time of each fan, and status records the event: 1 for failed, 0 for censored. Censored data represent cases that could not be tracked to the end of their life, so the fan might still fail or survive. Obtaining the result with R was easy and quick, and the program takes the censored data into consideration in the results, which is not so easy in hand calculation. For the purpose of the paper, the results from the R program were compared with hand calculations in two cases: censored data treated as failures and censored data treated as successes. The results of the three approaches differ significantly. If the user decides to use R for further calculations, its handling of censored data will give more precise results than hand calculation.
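
The effect of handling censoring correctly can be illustrated with a simple closed-form case. The sketch below assumes an exponential time-to-failure model (the MLE of the failure rate is then failures divided by total accumulated hours), which is a simplification of, not a reproduction of, the R-based analysis described above; the data are illustrative, not the original fan records.

```python
import numpy as np

# Illustrative fan data: running hours and status (1 = failed, 0 = censored).
hours  = np.array([4500, 4600, 11500, 11500, 15600, 16000, 20300, 20300,
                   20700, 20700, 20800, 22200, 30000, 30000, 30000, 30000])
status = np.array([1,    1,    1,     1,     1,     0,     1,     0,
                   0,    0,    1,     1,     0,     0,     0,     0])

# Exponential model: MLE of the failure rate with right-censored data is
# (number of failures) / (total accumulated hours).
lam_censored = status.sum() / hours.sum()

# Two "hand calculation" shortcuts of the kind the abstract compares against:
lam_all_fail = len(hours) / hours.sum()                  # censored counted as failures
lam_drop     = status.sum() / hours[status == 1].sum()   # censored units ignored

for name, lam in [("censoring handled", lam_censored),
                  ("censored as failed", lam_all_fail),
                  ("censored dropped", lam_drop)]:
    print(f"{name:<20s} MTTF = {1/lam:8.0f} h   R(10000 h) = {np.exp(-lam*10000):.3f}")
```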

Keywords: censored data, R statistical software, reliability analysis, time to failure

Procedia PDF Downloads 376
4134 Pattern Identification in Statistical Process Control Using Artificial Neural Networks

Authors: M. Pramila Devi, N. V. N. Indra Kiran

Abstract:

Control charts, predominantly in the form of the X-bar chart, are important tools in statistical process control (SPC). They are useful in determining whether a process is behaving as intended or whether there are some unnatural causes of variation. A process is out of control if a point falls outside the control limits or a series of points exhibits an unnatural pattern. In this paper, a study is carried out on four training algorithms for control chart pattern (CCP) recognition. For each algorithm, the optimal network structure is identified; the algorithms are then compared in terms of type I and type II errors and generalization, both with and without early stopping, and the best one is proposed.
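
A compact sketch of the general workflow, generating synthetic control chart patterns and training a multilayer perceptron with early stopping in scikit-learn; the pattern generator, window length, and network size are illustrative assumptions rather than the configurations studied in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
WINDOW = 32

def make_pattern(kind):
    """Synthetic control chart window in standard-deviation units."""
    noise = rng.normal(0, 1, WINDOW)
    t = np.arange(WINDOW)
    if kind == 0:   return noise                            # natural (in control)
    if kind == 1:   return noise + 0.1 * t                  # upward trend
    if kind == 2:   return noise + 2.0 * (t >= WINDOW // 2) # sudden shift
    return noise + 1.5 * np.sin(2 * np.pi * t / 8)          # cyclic

X = np.array([make_pattern(k) for k in range(4) for _ in range(500)])
y = np.repeat(np.arange(4), 500)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), early_stopping=True,
                    validation_fraction=0.2, max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)

# Type I error: a natural pattern flagged as unnatural; Type II: the converse.
type1 = np.mean(pred[y_te == 0] != 0)
type2 = np.mean(pred[y_te != 0] == 0)
print(f"accuracy {clf.score(X_te, y_te):.3f}  type I {type1:.3f}  type II {type2:.3f}")
```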

Keywords: control chart pattern recognition, neural network, backpropagation, generalization, early stopping

Procedia PDF Downloads 337
4133 Chemical Variability in the Essential Oils from the Leaves and Buds of Syzygium Species

Authors: Rabia Waseem, Low Kah Hin, Najihah Mohamed Hashim

Abstract:

The variability in the chemical components of the essential oils of Syzygium species has been evaluated. The leaves of the Syzygium species were collected from Perak, Malaysia. The essential oils were extracted using the conventional hydro-distillation procedure and analyzed by gas chromatography coupled with mass spectrometry (GC-MS). Twenty-seven constituents were found in the Syzygium species, of which the major constituents include α-Pinene (3.94%), α-Thujene (2.16%), α-Terpineol (2.95%), g-Elemene (2.89%), and D-Limonene (14.59%). The aim of this study was to compare the evaluated data with the existing literature and to confirm the major sources of variability through statistical analysis.

Keywords: chemotaxonomy, cluster analysis, essential oil, medicinal plants, statistical analysis

Procedia PDF Downloads 280
4132 Clustering of Association Rules of ISIS & Al-Qaeda Based on Similarity Measures

Authors: Tamanna Goyal, Divya Bansal, Sanjeev Sofat

Abstract:

For world-threatening terrorist attacks, where early detection, distinction, and prediction are effective diagnostic techniques, many data mining and statistical approaches are available to assure functionally accurate and precise analysis of terrorism data. The computational extraction of derived patterns is a non-trivial task which comprises specific domain discovery by means of sophisticated algorithm design and analysis. This paper proposes an approach for similarity extraction by obtaining the useful attributes from the available datasets of terrorist attacks and then applying a feature selection technique based on statistical impurity measures, followed by clustering techniques on the basis of similarity measures. On the basis of the degree of participation of attributes in the rules, the associative dependencies between the attacks are analyzed. Consequently, to compute the similarity among the discovered rules, we applied a weighted similarity measure. Finally, the rules are grouped by applying hierarchical clustering. We have applied the approach to an open source dataset to determine the usability and efficiency of our technique, and a literature search was also accomplished to support the efficiency and accuracy of our results.

Keywords: association rules, clustering, similarity measure, statistical approaches

Procedia PDF Downloads 291
4131 Underrepresentation of Right Middle Cerebral Infarct: A Statistical Parametric Mapping

Authors: Wi-Sun Ryu, Eun-Kee Bae

Abstract:

Prior studies have shown that patients with right hemispheric stroke are less likely to seek medical services than those with left hemispheric stroke. However, the underlying mechanism for this phenomenon is unknown. In the present study, we generated lesion probability maps in patients with right and left middle cerebral artery infarcts and compared them statistically. We found that involvement of the precentral gyrus (Brodmann area 44), a language area in the left hemisphere, was significantly higher in patients with left hemispheric stroke. This finding suggests that language dysfunction is more noticeable, thereby bringing more patients to hospital.

Keywords: cerebral infarct, brain MRI, statistical parametric mapping, middle cerebral infarct

Procedia PDF Downloads 312
4130 The Subjective Experiences of First-Time Chinese Parents' Transition to Parenthood and the Impact on Their Marital Satisfaction

Authors: Amy Yee Kai Wan

Abstract:

The arrival of a new baby to first-time parents is an exciting and joyous occasion, yet the daunting task of raising the baby and the uncertainty about how it will affect the couple's lives present a great challenge to them. This study examines the causes of conflict and the needs of new parents through qualitative research with five pairs of new parents in Hong Kong. Semi-structured in-depth qualitative interviews were conducted to explore the changes the babies brought to their marriages, the sources of support they received and found important, and the assistance they felt would help with their transition to parenthood. Thematic analysis was used to analyze the commonalities and differences between the five couples' subjective experiences. Narrative analysis was used to compare the experiences of the two parents who were the under-functioning parent of their couple, to study the different strategies they employed in response to the over-functioning parent, and to analyze how the marital relationships were affected. Four main themes emerged from the study: 1) change and adjustment in the marital relationship, 2) parents' level of involvement, 3) support in childcaring, and 4) challenges faced by the parents. Results indicated that father involvement in childcaring is an important element in the mother's marital satisfaction. The father's marital satisfaction, in turn, depends on the mother's satisfaction with his involvement, which affects her own marital satisfaction. Marital convergence and co-parenting alliance acted as moderators of marital satisfaction. Implications from the study include: i) offering programmes that improve the couple relationship and enhance parenting efficacy in tandem to improve overall marital satisfaction, and ii) offering prenatal counselling services or education to new parents from the prenatal to the postnatal period to help couples reduce discrepancies between the expectations and realities of their marital relationship and parenting responsibilities after their baby is born.

Keywords: co-parenting alliance, father involvement, marital convergence, maternal gatekeeping, new parents, transition to parenthood

Procedia PDF Downloads 125
4129 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods

Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim

Abstract:

Retaining slope structures are increasingly considered in geotechnical engineering projects due to extensive urban growth. These kinds of engineering constructions may present instabilities over time and may require reinforcement or even rebuilding of the structure. In this context, statistical analysis is an important tool for decision making regarding retaining structures. This study addresses the failure probability of the construction of a retaining wall over the debris of an old, collapsed one. The new solution will be approximately 350 m long and will be located on the margins of Lake Paranoá in Brasília, the capital of Brazil. The building process must also account for the use of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters, and a Standard Penetration Test (SPT) campaign defined the in situ soil stratigraphy. The parameters obtained were also verified using soil data, similar to the local soil, from a collection of master's and doctoral theses from the University of Brasília. Initial studies show that a concrete wall is the proper solution for this case, taking into account the technical, economic, and deterministic analyses. On the other hand, in order to better analyze the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two other initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.
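
A toy version of the Monte Carlo step is sketched below for a sliding check of a gravity wall, with the soil friction angle and unit weight treated as random variables; the geometry, loads, and distributions are invented for illustration and are unrelated to the actual Lake Paranoá design.

```python
import numpy as np

rng = np.random.default_rng(8)
n_sim = 100_000

# Illustrative sliding check for a gravity retaining wall.
H = 6.0                                            # wall height (m)
W = 450.0                                          # wall weight per metre (kN/m)
phi = np.deg2rad(rng.normal(28.0, 3.0, n_sim))     # soil friction angle (random)
gamma = rng.normal(18.0, 1.0, n_sim)               # soil unit weight (kN/m^3, random)

ka = (1.0 - np.sin(phi)) / (1.0 + np.sin(phi))     # Rankine active coefficient
driving = 0.5 * ka * gamma * H**2                  # active thrust (kN/m)
resisting = W * np.tan(phi)                        # base friction resistance (kN/m)

fs = resisting / driving                           # factor of safety per realization
p_failure = np.mean(fs < 1.0)
print(f"mean FS = {fs.mean():.2f}, probability of failure = {p_failure:.4f}")
```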

Keywords: economical analysis, probability of failure, retaining walls, statistical analysis

Procedia PDF Downloads 383
4128 Modeling of Daily Global Solar Radiation Using Ann Techniques: A Case of Study

Authors: Said Benkaciali, Mourad Haddadi, Abdallah Khellaf, Kacem Gairaa, Mawloud Guermoui

Abstract:

In this study, many experiments were carried out to assess the influence of the input parameters on the performance of the multilayer perceptron, one configuration of artificial neural networks. To estimate the daily global solar radiation on a horizontal surface, we developed several models using seven combinations of twelve meteorological and geographical input parameters collected from a radiometric station installed at Ghardaïa city (southern Algeria). To select the best combination providing good accuracy, six statistical indicators were evaluated, such as the root mean square error, the mean absolute error, the correlation coefficient, and the determination coefficient. We noted that the multilayer perceptron techniques have the best performance, except when the sunshine duration parameter is not included in the input variables. The maximum determination coefficient and correlation coefficient are equal to 98.20% and 99.11%, respectively. On the other hand, some empirical models were developed in order to compare their performance with that of the multilayer perceptron neural networks. The results obtained show that the neural network techniques give the best performance compared to the empirical models.
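
The sketch below mirrors the general procedure, training an MLP regressor on synthetic meteorological inputs and reporting the statistical indicators named above (RMSE, MAE, correlation and determination coefficients); the input set and the data-generating relation are assumptions for illustration, not the Ghardaïa measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

rng = np.random.default_rng(9)

# Synthetic stand-in data: sunshine duration, ambient temperature, relative
# humidity and day of year as inputs, daily global horizontal irradiation as output.
n = 1000
S   = rng.uniform(0, 12, n)                 # sunshine duration (h)
T   = rng.uniform(5, 45, n)                 # temperature (degC)
RH  = rng.uniform(5, 80, n)                 # relative humidity (%)
doy = rng.integers(1, 366, n)               # day of year
H = 2.0 * S + 0.1 * T - 0.05 * RH + 3.0 * np.sin(2 * np.pi * doy / 365) + rng.normal(0, 1, n)

X = np.column_stack([S, T, RH, doy])
X_tr, X_te, y_tr, y_te = train_test_split(X, H, test_size=0.25, random_state=0)

mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=2000, random_state=0))
mlp.fit(X_tr, y_tr)
pred = mlp.predict(X_te)

rmse = np.sqrt(mean_squared_error(y_te, pred))
mae  = mean_absolute_error(y_te, pred)
r    = np.corrcoef(y_te, pred)[0, 1]
print(f"RMSE={rmse:.3f}  MAE={mae:.3f}  r={r:.4f}  R^2={r2_score(y_te, pred):.4f}")
```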

Keywords: empirical models, multilayer perceptron neural network, solar radiation, statistical formulas

Procedia PDF Downloads 312
4127 A Survey on Internet of Things and Fog Computing as a Platform for Internet of Things

Authors: Samira Kalantary, Sara Taghipour, Mansoure Ghias Abadi

Abstract:

The Internet of Things (IoT) is a technological revolution that represents the future of computing and communications. The IoT is the convergence of the Internet with RFID, NFC, sensors, and smart objects, and fog computing is the natural platform for the IoT. At present, the IoT, as a new network communication technology, has rapidly shifted from concept to application on the fog computing virtual storage and computing platform. In this paper, we give an overview of the IoT and describe the difference between cloud computing and fog computing.

Keywords: cloud computing, fog computing, Internet of Things (IoT), IOT application

Procedia PDF Downloads 547
4126 Emile Meyerson's Philosophy of Science in Lacan's Early Theories

Authors: Hugo T. Jorge, Richard T. Simanke

Abstract:

Lacan’s work addresses overarching issues concerning the scientific intelligibility of the subject in its philosophical sense. Even though his reflection is not, strictly speaking, philosophy of science, it contains many traits that are typical of this branch of philosophy. However, the relation between Lacan’s early thought and the philosophy of science of his time is often disregarded or only incompletely accounted for in Lacanian scholarship. The French philosopher of science Emile Meyerson was often implicitly or explicitly referred to in Lacan’s works, yet few publications can be found on their relationship. The objective of this paper is to contribute to the analysis of this relationship, indicating some of its possible implications. For this, the convergence between Meyerson’s doctrine of science and Lacan’s works between 1936 and 1953 is discussed, as well as the conditions under which Lacan’s reception of Meyerson’s ideas took place. In conclusion, it is argued that this convergence allows for the clarification of important issues in Lacan’s early work, such as the concept of imago, his views on the nature of truth, and his thesis of the anthropomorphism of the natural sciences. Meyerson’s argument for the permanence of common sense within science makes Lacan’s claims about the anthropomorphism of the natural sciences more understandable. Similarly, Meyerson’s views on the epistemological shortfall of the Principle of Identity shed some light on Lacan’s 1936 critique of associationist concepts of the engram and of truth, and may be at the origin of his antirealist and anti-idealist stances. Meyerson’s Principle of Identity is also related to some aspects of Lacan’s concept of imago: the imago, understood as the unconscious condition for the identity in time of family figures in childhood, would be an excellent expression of the Principle of Identity. In this sense, the Principle of Identity may be linked to the concept of the imaginary as developed by Lacan in the 1950s. However, Lacan considerably distorts Meyerson’s views in his 1936 critique of Freud’s concept of libido. Finally, a possible relationship between Lacan’s late concept of the real and Meyerson’s concept of the irrational is suggested.

Keywords: imaginary, Lacan, Meyerson, philosophy of science, real

Procedia PDF Downloads 146
4125 Confidence Intervals for Process Capability Indices for Autocorrelated Data

Authors: Jane A. Luke

Abstract:

Persistent pressure on manufacturers from escalating consumer expectations and ever-growing global competitiveness has produced a rapidly increasing interest in the development of various manufacturing strategy models, and academic and industrial circles are taking a keen interest in the field of manufacturing strategy. Many manufacturing strategies are currently centered on the traditional concepts of focused manufacturing capabilities such as quality, cost, dependability, and innovation. Process capability analysis is usually conducted assuming that the process under study is in statistical control and that independent observations are generated over time. In practice, however, it is very common to come across processes which, due to their inherent nature, generate autocorrelated observations. The degree of autocorrelation affects the behavior of patterns on control charts; even small levels of autocorrelation between successive observations can have considerable effects on the statistical properties of conventional control charts. When observations are autocorrelated, the classical control charts exhibit nonrandom patterns and lack of control. Many authors have considered the effect of autocorrelation on the performance of statistical process control charts. In this paper, the effect of autocorrelation on confidence intervals for different PCIs is examined. Stationary Gaussian processes are explained, and the effect of autocorrelation on PCIs is described in detail. Confidence intervals for Cp and Cpk are constructed and computed both when the data are independent and when they are autocorrelated. Approximate lower confidence limits for various Cpk values are computed assuming an AR(1) model for the data. Simulation studies and industrial examples are considered to demonstrate the results.
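
A small simulation in the spirit of the paper: Cp and Cpk are computed from AR(1) data and an approximate lower confidence limit for Cpk (one common form attributed to Bissell) is checked for coverage; the process parameters, specification limits, and autocorrelation level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(10)

def cp_cpk(x, lsl, usl):
    mu, s = x.mean(), x.std(ddof=1)
    cp  = (usl - lsl) / (6 * s)
    cpk = min(usl - mu, mu - lsl) / (3 * s)
    return cp, cpk

def cpk_lower_bissell(cpk_hat, n, z=1.645):
    # One common form of Bissell's approximate 95% lower confidence limit.
    return cpk_hat * (1 - z * np.sqrt(1 / (9 * n * cpk_hat**2) + 1 / (2 * (n - 1))))

lsl, usl, n, phi = 9.0, 11.0, 100, 0.5             # phi: AR(1) autocorrelation

# Coverage experiment: the nominal limit assumes independent data, so its
# coverage deteriorates when observations follow an AR(1) process.
true_sigma = 0.25 / np.sqrt(1 - phi**2)            # marginal sd of the AR(1) series
true_cpk = min(usl - 10.0, 10.0 - lsl) / (3 * true_sigma)

covered = 0
for _ in range(2000):
    e = rng.normal(0, 0.25, n)
    x = np.empty(n); x[0] = 10.0 + e[0] / np.sqrt(1 - phi**2)
    for t in range(1, n):
        x[t] = 10.0 + phi * (x[t - 1] - 10.0) + e[t]
    _, cpk_hat = cp_cpk(x, lsl, usl)
    covered += cpk_lower_bissell(cpk_hat, n) <= true_cpk

print(f"true Cpk = {true_cpk:.2f}, empirical coverage of nominal 95% limit = {covered/2000:.3f}")
```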

Keywords: autocorrelation, AR(1) model, Bissell’s approximation, confidence intervals, statistical process control, specification limits, stationary Gaussian processes

Procedia PDF Downloads 357
4124 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach

Authors: Utkarsh A. Mishra, Ankit Bansal

Abstract:

At high temperature, radiative heat transfer is the dominant mode of heat transfer. It is governed by various phenomena such as photon emission, absorption, and scattering. The solution of the governing integro-differential equation of radiative transfer is a complex process, more so when the effects of the participating medium and wavelength-dependent properties are taken into consideration. Although a generic formulation of such a radiative transport problem can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between the simplicity and the accuracy of the solution. Recently, solutions of complicated mathematical problems with statistical methods based on the randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple yet powerful technique to solve radiative transfer problems in complicated geometries with an arbitrary participating medium. The method, on the one hand, increases the accuracy of estimation and, on the other hand, increases the computational cost. The participating media, generally gases such as CO₂, CO, and H₂O, present complex emission and absorption spectra. Modeling the emission and absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of the MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences; Halton, Sobol, and Faure low-discrepancy sequences are used in this study. They possess better space-filling performance than the uniform random number generator and give rise to low-variance, stable Quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational cost of the PMC simulation. A one-dimensional plane-parallel slab problem with a participating medium was formulated, and the history of some randomly sampled photon bundles was recorded to train an artificial neural network (ANN) back-propagation model. The flux was calculated using the standard quasi PMC and was taken as the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and the PMC model with the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and the total flux in both cases. A significant reduction in variance as well as a faster rate of convergence was observed for the QMC method over the standard PMC method. However, the results obtained with the ANN method showed greater variance (around 25-28%) compared to the other cases. There is great scope for machine learning models to help further reduce the computational cost once trained successfully. Multiple ways of selecting the input data as well as various architectures will be tried so that the problem environment can be fully represented in the ANN model. Better results can be achieved in this unexplored domain.
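
The variance advantage of low-discrepancy sampling can be seen on a toy one-dimensional transport-style integral, comparing pseudo-random (PMC-style) and scrambled Sobol (QMC-style) estimators with SciPy's qmc module; the integrand and sample sizes are illustrative and much simpler than the spectral PMC described above.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(11)

# Toy 1-D transport-style integral: I = integral_0^1 exp(-tau/mu) dmu
# (a transmitted fraction over direction cosines mu).
tau = 1.0
f = lambda mu: np.exp(-tau / np.clip(mu, 1e-12, None))

def estimate(points):
    return f(points).mean()

n, reps = 1024, 50
pmc_est = [estimate(rng.random(n)) for _ in range(reps)]
qmc_est = [estimate(qmc.Sobol(d=1, scramble=True, seed=s).random_base2(m=10).ravel())
           for s in range(reps)]

print("PMC  mean %.5f  std %.2e" % (np.mean(pmc_est), np.std(pmc_est, ddof=1)))
print("QMC  mean %.5f  std %.2e" % (np.mean(qmc_est), np.std(qmc_est, ddof=1)))
# The scrambled Sobol (QMC) estimator typically shows a much smaller spread
# than the pseudo-random (PMC) estimator at the same sample size.
```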

Keywords: radiative heat transfer, Monte Carlo Method, pseudo-random numbers, low discrepancy sequences, artificial neural networks

Procedia PDF Downloads 185
4123 Exploratory Study of the Influencing Factors for Hotels' Competitors

Authors: Asma Ameur, Dhafer Malouche

Abstract:

Hotel competitiveness research is an essential phase of the marketing strategy for any hotel. Knowing a hotel's competitors helps the hotelier grasp its position in the market and helps the customer make the right choice in picking a hotel. Thus, competitiveness is an important indicator that can be influenced by various factors. In fact, the issue of competitiveness, this ability to cope with competition, remains a difficult and complex concept to define and to exploit. Therefore, the purpose of this article is to conduct an exploratory study to calculate a competitiveness indicator for hotels. Further, this paper makes it possible to determine the criteria with a direct or indirect effect on the image and perception of a hotel. The present research looks into the right model for hotel competitiveness. For this reason, we exploit different theoretical contributions in the field of machine learning; in particular, we use statistical techniques such as Principal Component Analysis (PCA) to reduce the dimensions, as well as other techniques of statistical modeling. This paper presents a survey covering the techniques and methods used in hotel competitiveness research. Furthermore, this study allows us to deduce the significant variables that influence the determination of a hotel's competitors. Lastly, the experiences discussed in this article show that a hotel's competitors are influenced by several factors to different degrees.

Keywords: competitiveness, e-reputation, hotels' competitors, online hotel’ review, principal component analysis, statistical modeling

Procedia PDF Downloads 86
4122 Electricity Generation from Renewables and Targets: An Application of Multivariate Statistical Techniques

Authors: Filiz Ersoz, Taner Ersoz, Tugrul Bayraktar

Abstract:

Renewable energy is referred to as "clean energy", and common popular support for the use of renewable energy (RE) rests on its providing electricity with zero carbon dioxide emissions. This study provides useful insight into RE in the European Union (EU), especially into electricity generation obtained from renewables and the corresponding targets. The objective of this study is to identify groups of European countries using multivariate statistical analysis and selected indicators. The hierarchical clustering method is used to decide the number of clusters for the EU countries; the statistical hierarchical cluster analysis is based on Ward's clustering method and squared Euclidean distances. Hierarchical cluster analysis identified eight distinct clusters of European countries. Then, the non-hierarchical (k-means) clustering method was applied. Discriminant analysis was used to determine the validity of the results, with the data normalized by Z-score transformation. To explore the relationship between the selected indicators, correlation coefficients were computed. The results of the study reveal the current situation of RE in the European Union Member States.
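
A condensed sketch of the analysis chain (Z-score normalization, Ward hierarchical clustering, k-means, and discriminant analysis) on stand-in indicator data; the number of indicators and the synthetic country data are assumptions, and only the sequence of methods follows the abstract.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(12)

# Stand-in indicator matrix: 28 member states x 5 RE indicators (e.g. share of
# electricity from renewables, targets, CO2 emissions, ...), built with some
# group structure so the clustering has something to find.
groups = np.repeat(np.arange(8), [4, 4, 4, 4, 3, 3, 3, 3])     # 28 countries
centers = rng.normal(scale=3.0, size=(8, 5))
X = centers[groups] + rng.normal(scale=0.5, size=(28, 5))
Xz = StandardScaler().fit_transform(X)                         # Z-score transform

# Hierarchical clustering with Ward's method to settle on k = 8 clusters.
Z = linkage(Xz, method='ward')
hier_labels = fcluster(Z, t=8, criterion='maxclust')

# Non-hierarchical refinement with k-means at the same number of clusters.
km_labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(Xz)

# Discriminant analysis as a check on how well the indicators separate the clusters.
lda = LinearDiscriminantAnalysis().fit(Xz, km_labels)
print("hierarchical cluster sizes:", np.bincount(hier_labels)[1:])
print("k-means cluster sizes     :", np.bincount(km_labels))
print("LDA reclassification accuracy:", lda.score(Xz, km_labels))
```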

Keywords: share of electricity generation, k-means clustering, discriminant, CO2 emission

Procedia PDF Downloads 393
4121 Monte Carlo Methods and Statistical Inference of Multitype Branching Processes

Authors: Ana Staneva, Vessela Stoimenova

Abstract:

A parametric estimation of the multitype branching process (MBP) with a power series offspring distribution family is considered in this paper. The MLE for the parameters is obtained in the case when the observable data are incomplete and consist only of the generation sizes of the family tree of the MBP. The parameter estimates are calculated using the Monte Carlo EM algorithm, and the estimates of the posterior distribution and of the offspring distribution parameters are obtained using the Bayesian approach and the Gibbs sampler. The article presents various examples with bivariate branching processes, together with computational results, simulations, and an implementation using R.

Keywords: Bayesian, branching processes, EM algorithm, Gibbs sampler, Monte Carlo methods, statistical estimation

Procedia PDF Downloads 390
4120 TDApplied: An R Package for Machine Learning and Inference with Persistence Diagrams

Authors: Shael Brown, Reza Farivar

Abstract:

Persistence diagrams capture valuable topological features of datasets that other methods cannot uncover. Still, their adoption in data pipelines has been limited due to the lack of publicly available tools in R (and python) for analyzing groups of them with machine learning and statistical inference. In an easy-to-use and scalable R package called TDApplied, we implement several applied analysis methods tailored to groups of persistence diagrams. The two main contributions of our package are comprehensiveness (most functions do not have implementations elsewhere) and speed (shown through benchmarking against other R packages). We demonstrate applications of the tools on simulated data to illustrate how easily practical analyses of any dataset can be enhanced with topological information.

Keywords: machine learning, persistence diagrams, R, statistical inference

Procedia PDF Downloads 49
4119 Predicting the Relationship Between the Corona Virus Anxiety and Psychological Hardiness in Staff Working at Hospital in Shiraz Iran

Authors: Gholam Reza Mirzaei, Mehran Roost

Abstract:

This research was conducted with the aim of predicting the relationship between coronavirus anxiety and psychological hardiness in employees working at Shahid Beheshti Hospital in Shiraz. The research design was descriptive and correlational. The statistical population consisted of all the employees of Shahid Beheshti Hospital in Shiraz in 2021, from which 220 individuals were selected through convenience (availability) sampling. To collect data, Kobasa's psychological hardiness questionnaire and a coronavirus anxiety questionnaire were used. After collecting the data, the participants' scores were analyzed using Pearson's correlation coefficient, multiple regression analysis, and the SPSS-24 statistical software. The results of Pearson's correlation coefficient showed a significant negative correlation between psychological hardiness and its components (challenge, commitment, and control) and coronavirus anxiety; in addition, psychological hardiness, with a beta coefficient of 0.20, could predict coronavirus anxiety in hospital employees. Based on the results, plans can be made to enhance psychological hardiness through educational workshops to relieve the anxiety of healthcare staff.

Keywords: the corona virus, commitment, hospital employees, psychological hardiness

Procedia PDF Downloads 28