Search results for: multivariate models.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6977

6857 Small Target Recognition Based on Trajectory Information

Authors: Saad Alkentar, Abdulkareem Assalem

Abstract:

Recognizing small targets has always posed a significant challenge in image analysis. Over long distances, the image signal-to-noise ratio tends to be low, limiting the amount of useful information available to detection systems. Consequently, visual target recognition becomes an intricate task to tackle. In this study, we introduce a Track Before Detect (TBD) approach that leverages target trajectory information (coordinates) to effectively distinguish between noise and potential targets. By reframing the problem as a multivariate time series classification, we have achieved remarkable results. Specifically, our TBD method achieves an impressive 97% accuracy in separating target signals from noise within a mere half-second time span (consisting of 10 data points). Furthermore, when classifying the identified targets into our predefined categories—airplane, drone, and bird—we achieve an outstanding classification accuracy of 96% over a more extended period of 1.5 seconds (comprising 30 data points).
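The reframing described above — short trajectories treated as multivariate time series — can be illustrated with a toy sketch. Everything below (the simulated tracks, the displacement features, and the random-forest classifier) is a hypothetical stand-in; the paper's actual classifier and data are not specified in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_track(is_target, n_points=10):
    """Simulate 10 (x, y) detections: targets move smoothly, noise jumps."""
    if is_target:
        start = rng.uniform(0, 100, size=2)
        velocity = rng.uniform(-5, 5, size=2)
        steps = np.arange(n_points)[:, None]
        return start + velocity * steps + rng.normal(0, 0.5, (n_points, 2))
    return rng.uniform(0, 100, size=(n_points, 2))  # uncorrelated clutter

def features(track):
    # Point-to-point displacements capture trajectory smoothness.
    return np.diff(track, axis=0).ravel()

X = np.array([features(make_track(label == 1))
              for label in (0, 1) for _ in range(300)])
y = np.array([label for label in (0, 1) for _ in range(300)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

On this synthetic setup, consistent displacements separate true tracks from clutter almost perfectly, which mirrors why coordinate history is such a strong discriminator in the TBD setting.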

Keywords: small targets, drones, trajectory information, TBD, multivariate time series

Procedia PDF Downloads 17
6856 Reliability Estimation of Bridge Structures with Updated Finite Element Models

Authors: Ekin Ozer

Abstract:

Assessment of structural reliability is essential for efficient use of civil infrastructure subjected to hazardous events. Dynamic analysis of finite element models is a commonly used tool to simulate structural behavior and estimate its performance accordingly. However, theoretical models based purely on preliminary assumptions and design drawings may deviate from the actual behavior of the structure. This study proposes an up-to-date reliability estimation procedure that engages actual bridge vibration data to update finite element models and then performs reliability estimation accordingly. The proposed method utilizes vibration response measurements of bridge structures to identify modal parameters, then uses these parameters to calibrate finite element models originally based on design drawings. The proposed method not only shows that reliability estimates based on updated models differ from those of the original models, but also infers that non-updated models may overestimate the structural capacity.

Keywords: earthquake engineering, engineering vibrations, reliability estimation, structural health monitoring

Procedia PDF Downloads 180
6855 Detection of Chaos in General Parametric Model of Infectious Disease

Authors: Javad Khaligh, Aghileh Heydari, Ali Akbar Heydari

Abstract:

Mathematical epidemiological models for the spread of disease through a population are used to predict the prevalence of a disease or to study the impacts of treatment or prevention measures. Initial conditions for these models are measured from statistical data collected from a population. Since these initial conditions can never be exact, the presence of chaos in mathematical models has serious implications for the accuracy of the models as well as for how epidemiologists interpret their findings. This paper confirms the chaotic behavior of a model for dengue fever and an SI model by investigating sensitive dependence, bifurcation, and the 0-1 test under a variety of initial conditions.
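The 0-1 test mentioned above has a standard construction (Gottwald-Melbourne). A minimal sketch of the basic version is shown below, demonstrated on the logistic map rather than the paper's disease model; the choice of c and the map parameters are illustrative assumptions.

```python
import numpy as np

def zero_one_test(x, c=1.7, n_cut=None):
    """Basic Gottwald-Melbourne 0-1 test: K near 1 = chaotic, near 0 = regular."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    j = np.arange(1, N + 1)
    # Translation variables driven by the series.
    p = np.cumsum(x * np.cos(j * c))
    q = np.cumsum(x * np.sin(j * c))
    n_cut = n_cut or N // 10
    # Mean square displacement of (p, q) as a function of lag n.
    M = np.empty(n_cut)
    for n in range(1, n_cut + 1):
        M[n - 1] = np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
    # K is the linear-growth correlation of M(n) with n.
    return np.corrcoef(np.arange(1, n_cut + 1), M)[0, 1]

def logistic(r, x0=0.4, n=2000, discard=200):
    xs, x = [], x0
    for _ in range(n + discard):
        x = r * x * (1 - x)
        xs.append(x)
    return np.array(xs[discard:])

K_chaotic = zero_one_test(logistic(4.0))  # r = 4.0: chaotic regime
K_regular = zero_one_test(logistic(3.2))  # r = 3.2: period-2 cycle
```

For chaotic dynamics the displacement grows diffusively and K approaches 1; for the periodic orbit it stays bounded and K stays near 0, which is the behavior the test exploits.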

Keywords: epidemiological models, SEIR disease model, bifurcation, chaotic behavior, 0-1 test

Procedia PDF Downloads 296
6854 Innovative Methods of Improving Train Formation in Freight Transport

Authors: Jaroslav Masek, Juraj Camaj, Eva Nedeliakova

Abstract:

The paper is focused on an operational model for transporting single wagon consignments on a railway network using two different models of train formation. The paper gives an overview of possibilities for improving the quality of transport services. It deals with two models used in train formation: time-continuous and time-discrete. By applying these models in practice, a transport company can guarantee a higher quality of service and expect an increase in transport performance. The models are also applicable to other transport networks. They supplement the theoretical problem of train formation with new ways of looking at how the organization of wagon flows can be influenced.

Keywords: train formation, wagon flows, marshalling yard, railway technology

Procedia PDF Downloads 413
6853 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors

Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui

Abstract:

Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and widely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill-suited to detecting small faults. In this paper, Exponentially Weighted Moving Average (EWMA) charts based on the T2 and Q statistics, T2-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data.
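The idea of smoothing a PCA monitoring statistic with an EWMA can be sketched as follows. The synthetic process, the smoothing constant, and the injected fault size are all illustrative assumptions, not the paper's actual settings; only the Q statistic is charted here for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical correlated 4-variable process: 1000 in-control training samples.
A = rng.normal(size=(4, 2))
train = rng.normal(size=(1000, 2)) @ A.T + rng.normal(0, 0.1, (1000, 4))

mu, sd = train.mean(0), train.std(0)
Z = (train - mu) / sd
# PCA via SVD; retain 2 principal components.
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:2].T  # loadings of the principal subspace

def q_statistic(x):
    z = (x - mu) / sd
    resid = z - (z @ P) @ P.T  # projection residual outside the PC subspace
    return resid @ resid

# New data with a small mean shift injected halfway through.
new = rng.normal(size=(200, 2)) @ A.T + rng.normal(0, 0.1, (200, 4))
new[100:] += 0.3  # small, persistent fault in the process mean

# EWMA smoothing of Q accumulates the persistent small shift.
lmbda, ewma, chart = 0.2, 0.0, []
for x in new:
    ewma = lmbda * q_statistic(x) + (1 - lmbda) * ewma
    chart.append(ewma)
chart = np.array(chart)
```

The raw Q values for a 0.3-sigma-scale shift barely clear the noise sample by sample, but the EWMA chart rises and stays elevated once the fault begins, which is the mechanism that makes EWMA variants more sensitive to small faults.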

Keywords: data-driven method, process control, anomaly detection, dimensionality reduction

Procedia PDF Downloads 269
6852 The Role of Self-Confidence, Adversity Quotient, and Self-Efficacy in Critical Thinking: A Path Model

Authors: Bayu Dwi Cahyo, Ekohariadi, Theodorus Wiyanto Wibowo, I. G. P. Asto Budithahjanto, Eppy Yundra

Abstract:

The objective of this study is to examine the effects of self-confidence, adversity quotient, and self-efficacy on critical thinking. The participants of this research were 137 cadets of the Aviation Polytechnic of Surabaya, selected by purposive sampling. Data were collected using a Likert-scale questionnaire distributed to the specified number of respondents. SPSS AMOS v23 was used to test a number of a priori multivariate growth curve models and to examine relationships between the variables via path analysis. The result of the path analysis was (χ² = 88.463, df = 71, χ²/df = 1.246, GFI = .914, CFI = .988, P = .079, AGFI = .873, TLI = .985, RMSEA = .043). According to the analysis, self-confidence, adversity quotient, and self-efficacy each have a positive and significant relationship with critical thinking.

Keywords: self-confidence, adversity quotient, self-efficacy variables, critical thinking

Procedia PDF Downloads 121
6851 SOM Map vs Hopfield Neural Network: A Comparative Study in Microscopic Evacuation Application

Authors: Zouhour Neji Ben Salem

Abstract:

Microscopic evacuation focuses on evacuee behavior and the way evacuees search for a safe place in an egress situation. In recent years, several models have addressed the microscopic evacuation problem. Among them, we have proposed Artificial Neural Networks (ANN) as an alternative to mathematical models for dealing with such problems. In this paper, we present two ANN models, a SOM map and a Hopfield network, used to predict evacuee behavior in a disaster situation. These models are tested on a real case: the evacuation of the second floor of a Tunisian children's hospital in case of fire. The two models are studied and compared in order to evaluate their performance.

Keywords: artificial neural networks, self-organization map, hopfield network, microscopic evacuation, fire building evacuation

Procedia PDF Downloads 371
6850 Possibility of Making Ceramic Models from Condemned Plaster of Paris (POP) Moulds for Ceramics Production in Edo State, Nigeria

Authors: Osariyekemwen, Daniel Nosakhare

Abstract:

Some ceramic wastes, such as discarded (condemned) Plaster of Paris (POP) moulds in Auchi Polytechnic, Edo State, constitute environmental hazards. This study therefore bridges the foregoing gaps by using these discarded POP moulds to produce ceramic models for making casting moulds for mass production, in line with the possibility of using this medium to properly manage the condemned Plaster of Paris that litters our immediate environment. Presently, these moulds are a major waste disposal problem in the department. Hence, the study was undertaken to fabricate sanitary miniature models and contract fuse models, respectively. Findings arising from this study show that condemned Plaster of Paris can be carved when set; it neither shrinks nor expands, hence warping is quite unusual. Above all, it also gives a good finish with little deterioration over time when compared to clay models.

Keywords: plaster of Paris, condemn, moulds, models, production

Procedia PDF Downloads 152
6849 Short Review on Models to Estimate the Risk in the Financial Area

Authors: Tiberiu Socaciu, Tudor Colomeischi, Eugenia Iancu

Abstract:

Business failure affects, in various proportions, shareholders, managers, lenders (banks), suppliers, customers, the financial community, government, and society as a whole. In an era of telecommunications networks and interdependent markets, the effect of a company's failure is felt almost instantly. Effectively managing risk exposure thus requires sophisticated support systems, backed by analytical tools to measure, monitor, manage, and control the operational risks that may arise. As we know, bankruptcy is a phenomenon that managers want to avoid no matter what stage of its life cycle the company they direct is in. In our analysis, given the nature of the economic models reviewed (Altman, Conan-Holder, etc.), estimating a company's bankruptcy risk corresponds to some extent to tracing the company's own business cycle. Various models for predicting bankruptcy take into account direct and indirect aspects such as market position, company growth trend, competition structure, customer characteristics and retention, organization and distribution, location, etc. From the perspective of our research, we review the economic models known in theory and practice for estimating the risk of bankruptcy; such models are based on indicators drawn from major accounting firms.
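Of the models named above, the Altman model is the most widely known; it scores bankruptcy risk as a weighted sum of five accounting ratios. A sketch of the original 1968 public-company formula follows; the input figures are hypothetical.

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Altman (1968) Z-score for public manufacturing firms.

    Rule-of-thumb cutoffs: Z > 2.99 'safe' zone, Z < 1.81 'distress' zone.
    """
    x1 = working_capital / total_assets       # liquidity
    x2 = retained_earnings / total_assets     # cumulative profitability
    x3 = ebit / total_assets                  # operating efficiency
    x4 = market_value_equity / total_liabilities  # leverage
    x5 = sales / total_assets                 # asset turnover
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

# Hypothetical healthy firm (figures in millions):
z = altman_z(working_capital=25, retained_earnings=40, ebit=18,
             market_value_equity=120, sales=150, total_assets=100,
             total_liabilities=60)
# z = 4.154, comfortably in the 'safe' zone
```

The Conan-Holder and other continental models follow the same discriminant-score pattern with different ratios and weights fitted to their respective national samples.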

Keywords: Anglo-Saxon models, continental models, national models, statistical models

Procedia PDF Downloads 378
6848 Improving Safety Performance of Un-Signalized Intersections in Oman

Authors: Siham G. Farag

Abstract:

The main objective of this paper is to provide a new methodology for road safety assessment in Oman through the development of suitable accident prediction models. The GLM technique, with Poisson or negative binomial regression (NBR) using the SAS package, was carried out to develop these models. The paper utilized the accident data of 31 un-signalized T-intersections over three years. Five goodness-of-fit measures were used to assess the overall quality of the developed models. Two types of models were developed separately: flow-based models including only traffic exposure functions, and full models containing both exposure functions and other significant geometry and traffic variables. The results show that traffic exposure functions produced a much better fit to the accident data. The most effective geometric variables were major-road mean speed, minor-road 85th percentile speed, major-road lane width, distance to the nearest junction, and right-turn curb radius. The developed models can be used for intersection treatment or upgrading and to specify the appropriate design parameters of T-intersections. Finally, the models presented in this paper reflect intersection conditions in Oman and could represent the typical conditions in several countries in the Middle East, especially the Gulf countries.
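A flow-based accident prediction model of the kind described can be sketched with a Poisson GLM. The data below are synthetic, and the predictors (log traffic exposure and major-road mean speed) are chosen only for illustration; the paper's actual models were fitted in SAS.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(2)

# Hypothetical data for 31 T-intersections: log traffic exposure and
# major-road mean speed as predictors of 3-year accident counts.
n = 31
log_aadt = rng.uniform(6, 10, n)   # log of entering vehicles per day
speed = rng.uniform(40, 90, n)     # major-road mean speed, km/h
true_rate = np.exp(-6 + 0.6 * log_aadt + 0.02 * speed)
accidents = rng.poisson(true_rate)  # simulated observed counts

# Unpenalized Poisson GLM (log link), analogous to the flow-based model.
X = np.column_stack([log_aadt, speed])
model = PoissonRegressor(alpha=0).fit(X, accidents)
predicted = model.predict(X)
```

The log link guarantees non-negative predicted frequencies, which is the main reason Poisson and negative binomial GLMs dominate accident modelling over ordinary least squares.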

Keywords: accidents prediction models (APMs), generalized linear model (GLM), T-intersections, Oman

Procedia PDF Downloads 242
6847 Modelling and Mapping Malnourished Toddlers in Bojonegoro Regency with a Mixed Geographically Weighted Regression Approach

Authors: Elvira Mustikawati P.H., Iis Dewi Ratih, Dita Amelia

Abstract:

Bojonegoro has proclaimed a policy of zero malnutrition. Therefore, as an effort to address the cases of malnourished children in Bojonegoro, this study used the Mixed Geographically Weighted Regression (MGWR) approach to determine the factors that influence the percentage of malnourished children under five, where the factors can be divided into locally influential factors in each district and global factors that influence all districts. Based on the goodness-of-fit tests, the R2 and AIC values of the GWR models are better than those of the MGWR models: R2 and AIC for the MGWR models are 84.37% and 14.28, while for the GWR models they are 91.04% and -62.04, respectively. Based on the analysis with the GWR models, the districts of Sekar, Bubulan, Gondang, and Dander are districts where three predictor variables (percentage of vitamin A, percentage of births assisted by health personnel, and percentage of clean water) significantly influence the percentage of malnourished children under five.
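The local-coefficient idea behind GWR can be sketched as a kernel-weighted least squares fit at each location. The grid of districts, the predictor, the Gaussian kernel, and the bandwidth below are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical districts on a grid; the predictor's effect varies in space.
n = 100
coords = rng.uniform(0, 10, size=(n, 2))
x = rng.normal(size=n)
beta_true = 1 + 0.3 * coords[:, 0]  # coefficient grows from west to east
y = beta_true * x + rng.normal(0, 0.1, n)

def gwr_slope(at, bandwidth=1.5):
    """Local weighted least squares at one location (the basic GWR step)."""
    d2 = ((coords - at) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))  # Gaussian spatial kernel
    X = np.column_stack([np.ones(n), x])
    XtW = X.T * w                            # weight each observation
    beta = np.linalg.solve(XtW @ X, XtW @ y)
    return beta[1]                           # local slope for the predictor

slope_west = gwr_slope(np.array([1.0, 5.0]))
slope_east = gwr_slope(np.array([9.0, 5.0]))
```

A mixed (MGWR) model would keep some coefficients global, fitting them once for all districts, while letting the rest vary locally as above.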

Keywords: GWR, MGWR, R2, AIC

Procedia PDF Downloads 256
6846 An Overview of the New Era in Food Science and Technology

Authors: Raana Babadi Fathipour

Abstract:

The strict prerequisites of scientific journals, which require authors to demonstrate that experimental data are (in)significant from the statistical point of view, have driven a steep increase in the use and development of statistical software. Notably, the use of mathematical and statistical methods, including chemometrics and many other statistical methods/algorithms, in food science and technology has increased steeply in the last 20 years. The computational tools available can be used not only to run statistical analyses such as univariate and bivariate tests, as well as multivariate calibration and the development of complex models, but also to run simulations of different scenarios given a set of inputs, or simply to make predictions for specific data sets or conditions. A quick search in the most reputable scientific databases (PubMed, ScienceDirect, Scopus) shows that statistical methods have gained enormous ground in many areas.

Keywords: food science, food technology, food safety, computational tools

Procedia PDF Downloads 38
6845 A Comparative Analysis of E-Government Quality Models

Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri

Abstract:

Many quality models have been used to measure e-government portal quality. However, the absence of an international consensus on e-government portal quality models results in many differences in terms of quality attributes and measures. The aim of this paper is to compare and analyze the existing e-government quality models proposed in the literature (those that are based on ISO standards and those that are not) in order to propose guidelines for building a good and useful e-government portal quality model. Our findings show that there is no e-government portal quality model based on the new international standard ISO 25010. Besides that, the quality models are not based on a best-practice model that would allow agencies to both measure e-government portal quality and identify missing best practices for those portals.

Keywords: e-government, portal, best practices, quality model, ISO, standard, ISO 25010, ISO 9126

Procedia PDF Downloads 529
6844 Archaeological Study of Soul Houses in Ancient Egypt on Five Models in the Grand Egyptian Museum

Authors: Ayman Aboelkassem, Mahmoud Ali

Abstract:

Introduction: Models of soul houses appeared in prehistory and in the Old Kingdom and Middle Kingdom periods. These soul houses represented the deceased's imagination of his house in the afterlife; some of them had two floors. The study examines five models of soul houses which were discovered near the Saqqara site by an Egyptian mission. These models have been transferred to the Grand Egyptian Museum (GEM) to be ready for display at the new museum. We focus on the models of soul houses with GEM numbers 1276, 1280, 1281, 1282, and 8711, which date to the Old Kingdom period. All were made of pottery; the five models have an oval shape and are decorated with relief. Methodology: The study focuses on the development of soul houses during the different periods in ancient Egypt, the function of soul houses, the kinds of offerings placed in them, and the symbolism of the offerings' colors in ancient Egyptian belief. Conclusion: This study is useful for heritage and ancient civilization studies, especially when we talk about opening new museums like the Grand Egyptian Museum, which will display a new collection of soul houses. The study of soul houses and the kinds of offerings placed in them reflects the economic situation of Egyptian society and the kinds of oils that were famous in ancient Egypt.

Keywords: archaeology study, Grand Egyptian Museum, relief, soul houses

Procedia PDF Downloads 226
6843 Variation in the Traditional Knowledge of Curcuma longa L. in North-Eastern Algeria

Authors: A. Bouzabata, A. Boukhari

Abstract:

Curcuma longa L. (Zingiberaceae), commonly known as turmeric, has a long history of traditional uses for culinary purposes as a spice and a food colorant. The present study aimed to document the ethnobotanical knowledge about Curcuma longa and to assess the variation in the herbalists’ experience in Northeastern Algeria. Data were collected by semi-structured questionnaires and direct interviews with 30 herbalists. Ethnobotanical indices, including the fidelity level (FL%), the relative frequency citation (RFC) and use value (UV) were determined by quantitative methods. Diversity in the knowledge was analyzed using univariate, non-parametric and multivariate statistical methods. Three main categories of uses were recorded for C. longa: for food, for medicine and for cosmetic purposes. As a medicine, turmeric was used for the treatment of gastrointestinal, dermatological and hepatic diseases. Medicinal and food uses were correlated with both forms of use (rhizome and powder). The age group did not influence the use. Multivariate analyses showed a significant variation in traditional knowledge, associated with the use value, origin, quality and efficacy of the drug. These findings suggested that the geographical origin of C. longa affected the use in Algeria.
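The three indices used in the study have standard quantitative-ethnobotany definitions; a minimal sketch with hypothetical interview counts follows (the actual counts are not reported in the abstract, so the numbers below are illustrative only).

```python
# Standard definitions of the indices named in the abstract, computed on
# hypothetical counts for N = 30 interviewed herbalists.

def relative_frequency_citation(informants_citing, total_informants):
    """RFC = FC / N: share of informants who cite the species at all."""
    return informants_citing / total_informants

def use_value(total_use_reports, total_informants):
    """UV = sum of use-reports / N: overall importance of the species."""
    return total_use_reports / total_informants

def fidelity_level(informants_for_use, informants_citing_species):
    """FL% = (Ip / Iu) * 100: agreement on one specific use of a species."""
    return 100.0 * informants_for_use / informants_citing_species

rfc = relative_frequency_citation(27, 30)  # 27 of 30 herbalists cite turmeric
uv = use_value(64, 30)                     # 64 use-reports in total
fl_gastro = fidelity_level(18, 27)         # 18 cite gastrointestinal use
```

High FL% for a single ailment category signals consensus among informants, which is the kind of pattern the study's multivariate analysis tests for variation across origins.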

Keywords: curcuma, indices, knowledge, variation

Procedia PDF Downloads 516
6842 Detection of Abnormal Process Behavior in Copper Solvent Extraction by Principal Component Analysis

Authors: Kirill Filianin, Satu-Pia Reinikainen, Tuomo Sainio

Abstract:

Frequent measurements of product stream quality create a data overload that becomes more and more difficult to handle. In the current study, plant history data with multiple variables were successfully treated by principal component analysis to detect abnormal process behavior, particularly in copper solvent extraction. The multivariate model is based on the concentration levels of the main process metals recorded by an industrial on-stream X-ray fluorescence analyzer. After mean-centering and normalization of the concentration data set, a two-dimensional multivariate model was constructed under the principal component analysis algorithm. Normal operating conditions were defined through control limits assigned to squared score values on the x-axis and to residual values on the y-axis. 80 percent of the data set was taken as the training set, and the multivariate model was tested with the remaining 20 percent. Model testing showed that the control limits successfully detect abnormal behavior of the copper solvent extraction process as early warnings. Compared to the conventional technique of analyzing one variable at a time, the proposed model allows on-line detection of a process failure using information from all process variables simultaneously. Complex industrial equipment combined with advanced mathematical tools may be used for on-line monitoring of both process stream composition and final product quality. Defining normal operating conditions of the process supports reliable decision making in a process control room. Thus, industrial X-ray fluorescence analyzers equipped with an integrated data processing toolbox allow more flexibility in copper plant operation. Additional multivariate process control and monitoring procedures are recommended separately for the major components and for the impurities. Principal component analysis may be utilized not only to control the content of major elements in process streams, but also for continuous monitoring of plant feed. The proposed approach has potential in on-line instrumentation, providing a fast, robust and cheap application with automation abilities.

Keywords: abnormal process behavior, failure detection, principal component analysis, solvent extraction

Procedia PDF Downloads 283
6841 Running the Athena Vortex Lattice Code in JAVA through the Java Native Interface

Authors: Paul Okonkwo, Howard Smith

Abstract:

This paper describes a methodology to integrate the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the Blended Wing Body aircraft. The Athena Vortex Lattice code, developed at the Massachusetts Institute of Technology, allows for the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, Athena Vortex Lattice operation requires a text file containing the aircraft geometry to be loaded into the AVL solver in order to determine the aerodynamic forces and moments. However, automated operation is required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the JAVA design environment nonetheless requires a modification and recompilation of the AVL source code into an executable file capable of running on Windows and other platforms without the X11 libraries. This paper describes the procedure for integrating the FORTRAN-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft.

Keywords: aerodynamics, automation, optimisation, AVL, JNI

Procedia PDF Downloads 539
6840 Turbulent Forced Convection of Cu-Water Nanofluid: CFD Models Comparison

Authors: I. Behroyan, P. Ganesan, S. He, S. Sivasankaran

Abstract:

This study compares the predictions of five types of Computational Fluid Dynamics (CFD) models, including two single-phase models (i.e. Newtonian and non-Newtonian) and three two-phase models (Eulerian-Eulerian, mixture and Eulerian-Lagrangian), to investigate turbulent forced convection of Cu-water nanofluid in a tube with a constant heat flux on the tube wall. The Reynolds (Re) number of the flow is between 10,000 and 25,000, while the volume fraction of Cu particles used is in the range of 0 to 2%. The commercial CFD package ANSYS-Fluent is used. The results from the CFD models are compared with results from experimental investigations in the literature. According to the results of this study, the non-Newtonian single-phase model, in general, does not show good agreement with the Xuan and Li correlation in predicting the Nu number. The Eulerian-Eulerian model gives inaccurate results except for φ = 0.5%. The mixture model gives a maximum error of 15%. The Newtonian single-phase model and the Eulerian-Lagrangian model are, overall, the recommended models. This work can be used as a reference for selecting an appropriate model for future investigations. The study also gives proper insight into important factors such as Brownian motion, fluid behavior parameters and effective nanoparticle conductivity which should be considered or adjusted in each model.

Keywords: heat transfer, nanofluid, single-phase models, two-phase models

Procedia PDF Downloads 462
6839 Comparison of Deep Convolutional Neural Networks Models for Plant Disease Identification

Authors: Megha Gupta, Nupur Prakash

Abstract:

Identification of plant diseases has been performed using machine learning and deep learning models on the datasets containing images of healthy and diseased plant leaves. The current study carries out an evaluation of some of the deep learning models based on convolutional neural network (CNN) architectures for identification of plant diseases. For this purpose, the publicly available New Plant Diseases Dataset, an augmented version of PlantVillage dataset, available on Kaggle platform, containing 87,900 images has been used. The dataset contained images of 26 diseases of 14 different plants and images of 12 healthy plants. The CNN models selected for the study presented in this paper are AlexNet, ZFNet, VGGNet (four models), GoogLeNet, and ResNet (three models). The selected models are trained using PyTorch, an open-source machine learning library, on Google Colaboratory. A comparative study has been carried out to analyze the high degree of accuracy achieved using these models. The highest test accuracy and F1-score of 99.59% and 0.996, respectively, were achieved by using GoogLeNet with Mini-batch momentum based gradient descent learning algorithm.

Keywords: comparative analysis, convolutional neural networks, deep learning, plant disease identification

Procedia PDF Downloads 165
6838 The Promotion Effects for a Supply Chain System with a Dominant Retailer

Authors: Tai-Yue Wang, Yi-Ho Chen

Abstract:

In this study, we investigate a two-echelon supply chain with two suppliers and three retailers, one of which dominates the other retailers. A price-competition demand function is used to model this dominant retailer, which leads the market. Promotion strategies and negotiation schemes are integrated to form decision-making models under different scenarios. These models are then formulated as different mathematical programming models. Decision variables such as promotional costs, retailer prices, wholesale price, and order quantity are included in these models. The distributions of promotion costs under different cost allocation strategies are then discussed. Finally, an empirical example is used to validate our models. The results from this empirical example show that the profit model creates the largest profit for the supply chain, but with different profit-sharing results. At the same time, the more risk a member can take, the more profit is distributed to that member in the utility model.

Keywords: supply chain, price promotion, mathematical models, dominant retailer

Procedia PDF Downloads 381
6837 Institutional Capacity and Corruption: Evidence from Brazil

Authors: Dalson Figueiredo, Enivaldo Rocha, Ranulfo Paranhos, José Alexandre

Abstract:

This paper analyzes the effects of institutional capacity on corruption. Methodologically, the research design combines both descriptive and multivariate statistics to examine two original datasets based on secondary data. In particular, we employ a principal component model to estimate an indicator of institutional capacity for both state audit institutions and subnational judiciary courts. Then, we estimate the effect of institutional capacity on two dependent variables: (1) the incidence of administrative irregularities and (2) the time elapsed to judge corruption cases. The preliminary results using ordinary least squares, negative binomial and Tobit models suggest the same conclusions: the higher the institutional audit capacity, the higher the probability of detecting a corruption case. On the other hand, the higher the institutional capacity of the state judiciary, the shorter the time to judge corruption cases.
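Estimating a capacity indicator as the first principal component, as described, can be sketched on synthetic data. The number of states, the indicator loadings, and the noise level below are assumptions for illustration; the real audit-institution attributes are not reported in the abstract.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical audit-institution attributes (27 states x 4 indicators),
# all driven by one latent "capacity" factor plus noise.
latent = rng.normal(size=27)
indicators = (latent[:, None] * np.array([0.9, 0.8, 0.7, 0.6])
              + rng.normal(0, 0.3, (27, 4)))

# Standardize, then take the first principal component as the index.
Z = (indicators - indicators.mean(0)) / indicators.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
capacity_index = Z @ Vt[0]

# A principal component's sign is arbitrary; orient the index so that
# higher values mean higher scores on the underlying indicators.
if np.corrcoef(capacity_index, Z[:, 0])[0, 1] < 0:
    capacity_index = -capacity_index
```

The resulting scores can then serve as the explanatory variable in the OLS, negative binomial, or Tobit regressions described in the abstract.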

Keywords: institutional capacity, corruption, state level institutions, evidence from Brazil

Procedia PDF Downloads 326
6836 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation

Authors: Mohammad Anwar, Shah Waliullah

Abstract:

This study investigated panel data regression models, using Bayesian and classical methods to study the impact of institutions on economic growth with data from 1990-2014, especially in developing countries. Under both classical and Bayesian methodology, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used in this paper, with a normal-gamma prior for the panel data models. The analysis was done with the WinBUGS14 software. The estimated results of the study showed that panel data models are valid models in Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed effect model is the best model in the Bayesian estimation of panel data models, as it proved to have the lowest standard error compared to the other models.
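The normal-gamma prior mentioned above leads to a closed-form conjugate update for the regression coefficients. A pooled (common-effect) sketch on synthetic data follows; the variable names, prior settings, and data are illustrative assumptions (the study itself fitted its models by MCMC in WinBUGS14).

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical panel flattened to a pooled regression: growth regressed
# on an institutions index, true intercept 2.0 and true slope 0.5.
n = 200
inst = rng.normal(size=n)
growth = 2.0 + 0.5 * inst + rng.normal(0, 1.0, n)

X = np.column_stack([np.ones(n), inst])
m0 = np.zeros(2)           # prior mean of the coefficients
V0_inv = np.eye(2) * 1.0   # prior precision (normal part of normal-gamma)

# Conjugate update: posterior precision and posterior mean.
Vn_inv = V0_inv + X.T @ X
mn = np.linalg.solve(Vn_inv, V0_inv @ m0 + X.T @ growth)

posterior_intercept, posterior_slope = mn
```

With 200 observations the likelihood dominates the weak prior, so the posterior mean lands close to the classical least-squares estimate; in the hierarchical fixed-effect setting of the paper, each unit would carry its own intercept with the same kind of update.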

Keywords: Bayesian approach, common effect, fixed effect, random effect, Dynamic Random Effect Model

Procedia PDF Downloads 53
6835 Forced-Choice Measurement Models of Behavioural, Social, and Emotional Skills: Theory, Research, and Development

Authors: Richard Roberts, Anna Kravtcova

Abstract:

Introduction: The realisation that personality can change over the course of a lifetime has led to a new companion model to the Big Five, the behavioural, emotional, and social skills approach (BESSA). BESSA hypothesizes that this set of skills represents how the individual is thinking, feeling, and behaving when the situation calls for it, as opposed to traits, which represent how someone tends to think, feel, and behave averaged across situations. The five major skill domains share parallels with the Big Five Factor (BFF) model: creativity and innovation (openness), self-management (conscientiousness), social engagement (extraversion), cooperation (agreeableness), and emotional resilience (emotional stability) skills. We point to noteworthy limitations in the current operationalisation of BESSA skills (i.e., via Likert-type items) and offer up a different measurement approach: forced choice. Method: In this forced-choice paradigm, individuals were given three skill items (e.g., managing my time) and asked to select the one they believed they were “best at” and the one they were “worst at”. Thurstonian IRT models allow these responses to be placed on a normative scale. Two multivariate studies (N = 1178) were conducted with a 22-item forced-choice version of the BESSA, a published measure of the BFF, and various criteria. Findings: Confirmatory factor analysis of the forced-choice assessment showed acceptable model fit (RMSEA < 0.06), while reliability estimates were reasonable (around 0.70 for each construct). Convergent validity evidence was as predicted (correlations between 0.40 and 0.60 for corresponding BFF and BESSA constructs). Notable was the extent to which the forced-choice BESSA assessment improved upon test-criterion relationships over and above the BFF.
For example, typical regression models find BFF personality accounting for 25% of the variance in life satisfaction scores; both studies showed incremental gains over the BFF exceeding 6% (i.e., BFF and BESSA together accounted for over 31% of the variance in both studies). Discussion: Forced-choice measurement models offer up the promise of creating equated test forms that may unequivocally measure skill gains and are less prone to fakability and reference bias effects. Implications for practitioners are discussed, especially those interested in selection, succession planning, and training and development. We also discuss how the forced choice method can be applied to other constructs like emotional immunity, cross-cultural competence, and self-estimates of cognitive ability.

Keywords: Big Five, forced-choice method, BFF, methods of measurement

Procedia PDF Downloads 67
6834 Optimal Maintenance Policy for a Partially Observable Two-Unit System

Authors: Leila Jafari, Viliam Makis, G. B. Akram Khaleghei

Abstract:

In this paper, we present a maintenance model of a two-unit series system with economic dependence. Unit#1, which is considered to be more expensive and more important, is subject to condition monitoring (CM) at equidistant, discrete time epochs and unit#2, which is not subject to CM, has a general lifetime distribution. The multivariate observation vectors obtained through condition monitoring carry partial information about the hidden state of unit#1, which can be in a healthy or a warning state while operating. Only the failure state is assumed to be observable for both units. The objective is to find an optimal opportunistic maintenance policy minimizing the long-run expected average cost per unit time. The problem is formulated and solved in the partially observable semi-Markov decision process framework. An effective computational algorithm for finding the optimal policy and the minimum average cost is developed and illustrated by a numerical example.
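The core of such a partially observable policy is a Bayesian belief update over the hidden state of unit 1, with maintenance triggered when the posterior probability of the warning state crosses a control limit. The sketch below shows one such filter step; the transition probabilities, observation means, and control limit are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Transition matrix over the operational states of unit 1 (healthy, warning);
# the warning state is treated as absorbing until maintenance or failure.
P = np.array([[0.95, 0.05],
              [0.00, 1.00]])
MEANS = [np.zeros(2), np.array([1.0, 0.8])]  # observation mean per state

def gauss_like(x, mu):
    """Unnormalised bivariate Gaussian likelihood (identity covariance);
    the normalising constant cancels in the Bayes update below."""
    d = x - mu
    return float(np.exp(-0.5 * d @ d))

def update_belief(pi_warn, x):
    """One HMM filter step: propagate the state prior, then apply Bayes' rule."""
    prior = np.array([1.0 - pi_warn, pi_warn]) @ P
    post = prior * np.array([gauss_like(x, MEANS[0]), gauss_like(x, MEANS[1])])
    return post[1] / post.sum()  # posterior P(warning | all data so far)

CONTROL_LIMIT = 0.3  # trigger opportunistic maintenance above this belief
beliefs, pi = [], 0.0
for x in [np.array([0.1, -0.2]), np.array([0.9, 0.7]), np.array([1.1, 0.9])]:
    pi = update_belief(pi, x)
    beliefs.append(pi)
    print(f"P(warning) = {pi:.3f}", "-> maintain" if pi > CONTROL_LIMIT else "")
```

In the paper's framework the control limit itself is optimised within the semi-Markov decision process to minimise the long-run average cost; here it is simply fixed for illustration.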

Keywords: condition-based maintenance, semi-Markov decision process, multivariate Bayesian control chart, partially observable system, two-unit system

Procedia PDF Downloads 436
6833 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR

Authors: Pascal Mwenge, Tumisang Seodigeng

Abstract:

The growth of the world population and the third industrial revolution have led to a high demand for fuels. At the same time, the depletion of global fossil fuel deposits and the air pollution caused by these fuels have compounded the challenges the world faces in meeting its energy needs. New forms of environmentally friendly and renewable fuels, such as biodiesel, are therefore needed. The primary analytical techniques for monitoring methanolysis yield have been chromatography and spectroscopy; these methods are reliable but demanding, costly, and unable to provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier transform infrared) spectroscopy was studied; the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. Quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module of the iC IR 7.0 software. Fifteen samples of known concentration, taken in duplicate for model calibration and cross-validation, were used for the modelling; the data were pre-processed using mean centering, variance scaling, square-root spectral transformation, and solvent subtraction. These pre-processing steps improved the performance indexes: RMSEC from 7.98 to 0.0096, RMSECV from 11.2 to 3.41, RMSEP from 6.32 to 2.72, and cumulative R² from 0.9416 to 0.9999. The R² values of 1 (training), 0.9918 (test), and 0.9946 (cross-validation) indicated the fitness of the model. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the two agreed closely at concentrations above 18%. The software eliminated the complexity of Partial Least Squares (PLS) chemometrics. It was concluded that the model could be used to monitor the methanolysis of sunflower oil at both industrial and laboratory scale.

Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR

Procedia PDF Downloads 126
6832 Management of Cultural Heritage: Bologna Gates

Authors: Alfonso Ippolito, Cristiana Bartolomei

Abstract:

A growing demand is felt today for realistic 3D models that enable the understanding and popularization of historical-artistic heritage. The evaluation and preservation of cultural heritage are inextricably connected with innovative processes for gaining, managing, and using knowledge. The development and refinement of techniques for acquiring and processing photorealistic 3D models have made them pivotal elements for popularizing information about objects on the scale of architectonic structures.

Keywords: cultural heritage, databases, non-contact survey, 2D-3D models

Procedia PDF Downloads 392
6831 Data-Centric Anomaly Detection with Diffusion Models

Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu

Abstract:

Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data through image generation with diffusion models. The algorithm addresses data collection challenges in real-world scenarios and points toward data augmentation through the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that with only 30% of the original normal images, unsupervised modeling with state-of-the-art approaches can achieve equivalent performance. With the addition of diffusion-generated images (equivalent to 10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.

Keywords: diffusion models, anomaly detection, data-centric, generative AI

Procedia PDF Downloads 44
6830 Private and Public Health Sector Difference on Client Satisfaction: Results from Secondary Data Analysis in Sindh, Pakistan

Authors: Wajiha Javed, Arsalan Jabbar, Nelofer Mehboob, Muhammad Tafseer, Zahid Memon

Abstract:

Introduction: Researchers globally have strived to explore the diverse factors that increase the uptake and continuation of family planning methods. Client satisfaction is one of the core determinants facilitating continuation of family planning methods. There is a major debate, yet scanty evidence, contrasting the public and private sectors with respect to client satisfaction. The objective of this study is to compare the quality of care provided by the public and private sectors of Pakistan through a client satisfaction lens. Methods: We used the Pakistan Demographic and Health Survey 2012-13 dataset (Sindh province) on a total of 3133 Married Women of Reproductive Age (MWRA) aged 15-49 years. Source of family planning (public/private sector) was the main exposure variable. The outcome variable was client satisfaction, judged by ten different dimensions. Means and standard deviations were calculated for continuous variables, while frequencies and percentages were computed for categorical variables. For univariate analysis, the Chi-square/Fisher exact test was used to assess the association between client satisfaction and sector. Ten multivariate models were built. Variables were checked for multicollinearity, confounding, and interaction, and logistic regression was then used to explore the relationship between client satisfaction and each outcome after adjusting for all known confounding factors; results are presented as OR and AOR (95% CI). Results: Multivariate analyses showed that clients were less satisfied with contraceptive provision in the private sector than in the public sector (AOR 0.92, 95% CI 0.63-1.68), although the result was not statistically significant.
Clients were more satisfied with the private sector than with the public sector on the other determinants of quality of care: follow-up care (AOR 3.29, 95% CI 1.95-5.55), infection prevention (AOR 2.41, 95% CI 1.60-3.62), counseling services (AOR 2.01, 95% CI 1.27-3.18), timely treatment (AOR 3.37, 95% CI 2.20-5.15), attitude of staff (AOR 2.23, 95% CI 1.50-3.33), punctuality of staff (AOR 2.28, 95% CI 1.92-4.13), timely referral (AOR 2.34, 95% CI 1.63-3.35), staff cooperation (AOR 1.75, 95% CI 1.22-2.51), and complications handling (AOR 2.27, 95% CI 1.56-3.29).

Keywords: client satisfaction, family planning, public private partnership, quality of care

Procedia PDF Downloads 385
6829 Copula Markov Switching Multifractal Models for Forecasting Value-at-Risk

Authors: Giriraj Achari, Malay Bhattacharyya

Abstract:

In this paper, the effectiveness of Copula Markov Switching Multifractal (MSM) models at forecasting the Value-at-Risk of a two-stock portfolio is studied. The innovations are allowed to be drawn from distributions that can capture skewness and leptokurtosis, which are well-documented empirical characteristics of financial returns. The candidate distributions considered for this purpose are the Johnson-SU, Pearson Type-IV, and α-stable distributions. The two univariate marginal distributions are combined using a Student-t copula. All parameters are estimated by maximum likelihood. Finally, the models are compared in terms of the accuracy of their Value-at-Risk (VaR) forecasts using tests of unconditional coverage and independence. It is found that Copula-MSM models with leptokurtic innovation distributions perform slightly better than the Copula-MSM model with normal innovations. Copula-MSM models in general produce better VaR forecasts than traditional methods such as historical simulation, the variance-covariance approach, and Copula-Generalized Autoregressive Conditional Heteroscedasticity (Copula-GARCH) models.
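The standard test of unconditional coverage is Kupiec's proportion-of-failures likelihood-ratio test: under a correct VaR model, the observed violation rate should match the nominal level. A minimal sketch (the violation counts below are illustrative, not the paper's results):

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(x, n, p):
    """Kupiec likelihood-ratio test of unconditional coverage.

    x: observed VaR violations (must satisfy 0 < x < n for this simple form),
    n: number of forecasts, p: nominal violation rate (0.01 for 99% VaR).
    Returns the LR statistic and its p-value under a chi-squared(1) null.
    """
    phat = x / n  # observed violation rate
    lr = -2 * (x * np.log(p) + (n - x) * np.log(1 - p)
               - x * np.log(phat) - (n - x) * np.log(1 - phat))
    return lr, chi2.sf(lr, df=1)

# 250 trading days of 99% VaR forecasts: about 2.5 violations are expected
for x in (3, 9):
    lr, pval = kupiec_pof(x, 250, 0.01)
    print(f"{x} violations: LR = {lr:.2f}, p-value = {pval:.3f}")
```

Three violations are consistent with correct coverage, while nine reject it; the independence test (e.g., Christoffersen's) additionally checks that violations do not cluster in time.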

Keywords: Copula, Markov Switching, multifractal, value-at-risk

Procedia PDF Downloads 142
6828 Digital Marketing Maturity Models: Overview and Comparison

Authors: Elina Bakhtieva

Abstract:

The variety of available digital tools, strategies, and activities might confuse and disorient even an experienced marketer. This applies in particular to B2B companies, which are usually less flexible than B2C companies in taking up digital technology. B2B companies lack a framework that corresponds to the specifics of B2B business and helps to evaluate a company’s capabilities and choose an appropriate path. A B2B digital marketing maturity model helps to fill this gap. However, modern marketing offers no widely accepted digital marketing maturity model, so some marketing institutions provide their own tools. The purpose of this paper is to build an optimized B2B digital marketing maturity model based on a SWOT (strengths, weaknesses, opportunities, and threats) analysis of existing models. The current study provides an analytical review of the existing digital marketing maturity models with open access. The results of the research are twofold. First, the SWOT analysis outlines the main advantages and disadvantages of the existing models. Second, the strengths of the existing models help to identify the main characteristics and structure of an optimized B2B digital marketing maturity model. The research findings indicate that only one of the three analyzed models could be used as a standalone tool. This study is among the first to examine the use of maturity models in digital marketing. It helps businesses to choose the most effective of the existing digital marketing maturity models, and it creates a base for future research on digital marketing maturity models. This study contributes to the emerging B2B digital marketing literature by providing a SWOT analysis of the existing digital marketing maturity models and suggesting a structure and main characteristics of an optimized B2B digital marketing maturity model.

Keywords: B2B digital marketing strategy, digital marketing, digital marketing maturity model, SWOT analysis

Procedia PDF Downloads 308