Search results for: Bayesian adjustment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 793

583 Digital Image Steganography with Multilayer Security

Authors: Amar Partap Singh Pharwaha, Balkrishan Jindal

Abstract:

In this paper, a new method is developed for hiding an image within a digital image with multilayer security. In the proposed method, the secret image is first encrypted using a flexible-matrix-based symmetric key to add the first layer of security. Another layer of security is then added by encrypting the ciphered data using a Pythagorean-theorem method. The ciphered data bits (4 bits) produced after double encryption are then embedded within the digital image in the spatial domain using Least Significant Bit (LSB) substitution. To improve the image quality of the stego-image, an improved form of pixel adjustment process is proposed. To evaluate the effectiveness of the proposed method, image quality metrics including Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), entropy, correlation, mean value and Universal Image Quality Index (UIQI) are measured. It has been found experimentally that the proposed method provides higher security as well as robustness, and the results of this study are quite promising.
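
A minimal sketch of the spatial-domain LSB substitution step only (the two encryption layers and the improved pixel adjustment are omitted; the array shapes and 16-bit payload are illustrative assumptions, not the authors' exact scheme):

```python
import numpy as np

def embed_lsb(cover: np.ndarray, payload_bits: np.ndarray) -> np.ndarray:
    """Embed a flat bit array into the least significant bit of each pixel."""
    flat = cover.flatten().astype(np.uint8)
    if payload_bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:payload_bits.size] &= 0xFE            # clear the LSBs
    flat[:payload_bits.size] |= payload_bits    # write the payload bits
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> np.ndarray:
    """Read back the first n_bits least significant bits."""
    return (stego.flatten()[:n_bits] & 1).astype(np.uint8)

# toy example: hide 16 bits in an 8x8 grayscale cover and verify recovery
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (8, 8), dtype=np.uint8)
bits = rng.integers(0, 2, 16, dtype=np.uint8)
stego = embed_lsb(cover, bits)
assert np.array_equal(extract_lsb(stego, 16), bits)

# PSNR of the stego-image against the cover, one of the quality metrics used
mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
print("PSNR (dB):", float("inf") if mse == 0 else 10 * np.log10(255**2 / mse))
```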

Keywords: Pythagorean theorem, pixel adjustment, ciphered data, image hiding, least significant bit, flexible matrix

Procedia PDF Downloads 310
582 The Pursuit of Marital Sustainability Inspired by the Successful Matrimony of Two Distinct Indonesian Ethnic Groups as a Learning Process

Authors: Mutiara Amalina Khairisa, Purnama Arafah, Rahayu Listiana Ramli

Abstract:

In recent years, divorce cases have increased markedly. Betrayal in the form of infidelity, poor communication between partners, economic problems, selfishness on both sides, intervention by the parents of both partners (which occurs frequently in Asia, and especially in Indonesia), differences in principles and beliefs, the depletion of the 'sense of romance', role conflict, large differences in the purpose of marriage, and sexual satisfaction are assumed to be the primary causes of divorce. Every married couple wants to achieve a happy family life, but severe problems brought about by any of these main factors are a common cause of failed marriages. The purpose of this study is to find out how marital adjustment and the factors supporting the success of that adjustment, two inseparable things assumed to form a single framework, can affect success in marriage and thereby become a means of reducing the desire to divorce. These two inseparable elements make it possible to learn from successful inter-ethnic marriages how to preserve marital wholeness.

Keywords: marital adjustment, marital sustainability, learning process, successful ethnicity differences marriage, basic cultural values

Procedia PDF Downloads 403
581 Regular or Irregular: An Investigation of Medicine Consumption Pattern with Poisson Mixture Model

Authors: Lichung Jen, Yi Chun Liu, Kuan-Wei Lee

Abstract:

Abundant data have accumulated in databases nowadays and are commonly used to support decision-making. In the healthcare industry, for instance, ordering pharmacy inventory is one of a hospital's key decisions. A large drug inventory increases current costs, and approaching expiration dates may lead to future issues such as drug disposal and recycling. In contrast, underestimating the demand for pharmacy inventory, particularly standing drugs, affects medical treatment and possibly the hospital's reputation. The prescription behaviour of hospital physicians is one of the critical factors influencing this decision, particularly irregular prescription behaviour. If a drug's usage in a month is irregular and less than the regular usage, subsequent stockpiling may follow; on the contrary, if a drug has been prescribed more often than expected, inventory may run short. We propose a hierarchical Bayesian mixture model with two components to identify physicians' regular/irregular prescription patterns with probabilities. Heterogeneity across hospitals is considered in our proposed hierarchical Bayes model. The results suggest that modeling physicians' prescription patterns is beneficial for estimating medication order quantities and for the hospital's pharmacy inventory management. Managerial implications and future research are discussed.
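
A simplified illustration of the two-component idea: an EM fit of a two-component Poisson mixture to monthly prescription counts. The paper's model is hierarchical Bayesian with hospital-level heterogeneity; this non-hierarchical EM version on made-up counts only sketches how regular/irregular components and membership probabilities arise:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
# synthetic monthly usage counts: a "regular" and an "irregular" component
counts = np.concatenate([rng.poisson(20, 80), rng.poisson(5, 20)])

lam = np.array([10.0, 30.0])   # initial component rates
pi = np.array([0.5, 0.5])      # initial mixing weights
for _ in range(200):           # EM iterations
    # E-step: responsibility of each component for each observation
    w = pi * poisson.pmf(counts[:, None], lam)
    w /= w.sum(axis=1, keepdims=True)
    # M-step: update mixing weights and component rates
    pi = w.mean(axis=0)
    lam = (w * counts[:, None]).sum(axis=0) / w.sum(axis=0)

print("mixing weights:", pi.round(3), "rates:", lam.round(2))
# posterior probability that each month belongs to the low-rate component
irregular_prob = w[:, np.argmin(lam)]
```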

Keywords: hierarchical Bayesian model, Poisson mixture model, medicine prescription behavior, irregular behavior

Procedia PDF Downloads 98
580 Psychopathology Evaluation for People with Intellectual Disability Living in Institutions Using the Chinese Version of the Psychopathology Inventory

Authors: Lin Fu-Gong

Abstract:

Background: As the WHO has announced, people with intellectual disability (ID) are vulnerable to mental health problems, and there are few custom-made mental health scales for monitoring the mental health of these people. People with ID who have mental problems often face a worse prognosis and usually become a heavier burden on their caregivers. Purpose: In this study, we intend to develop a psychopathology scale as a practical tool for monitoring the mental health of people with ID living in institutions. Methods: With Dr. Matson's prior agreement, we adopted the Psychopathology Inventory for Mentally Retarded Adults, developed by Professor Matson, whose reliability and validity are certified in Western countries. Over the past year, we translated the inventory into a validated Chinese version that takes the domestic cultural background into account. The validity and reliability of mental health evaluations using this inventory were then assessed among people with ID living in an institution. Results: The inventory includes eight psychiatric disorder scales: schizophrenic, affective, psychosexual, adjustment, anxiety, somatoform and personality disorders, and inappropriate mental adjustment. Around 83% of the 40 people investigated, randomly selected from the institution, were found by the two evaluators to have at least one disorder and were recommended for medical help. Among the residents examined, somatoform disorder and inappropriate mental adjustment were the most prevalent, affecting 60% and 78% of residents, respectively. Conclusion: The results showed that the prevalence of psychiatric disorders is relatively high among people with ID in institutions and that their mental problems require further care and follow-up. The psychopathology inventory proved to be a useful tool for institutional caregivers and managers, and for the government's long-term care policy. In the coming stage, we plan to extend the use of the validated Chinese version of the inventory to more types of institutions for people with ID, in order to track their dynamic mental health status, including medical needs, relapse and rehabilitation, and to promote their mental health.

Keywords: intellectual disability, psychiatric disorder, psychopathology inventory, mental health, institution

Procedia PDF Downloads 239
579 Genetic and Environmental Variation in Reproductive and Lactational Performance of Holstein Cattle

Authors: Ashraf Ward

Abstract:

The effect of calving interval on 305-day milk yield for the first three lactations was studied in order to increase the efficiency of selection schemes and to manage more efficiently Holstein cows raised on small farms in Libya. Results obtained by processing data from 1476 cows, managed on 935 small-scale farms, indicated that the current calving interval significantly affects milk production in the first three lactations (p<0.05). The preceding calving interval affected 305-day milk yield (p<0.05) in the second lactation only. A linear regression model accounted for 20-25% of the total variance of 305-day milk yield. Extending the calving interval beyond 420, 430 and 450 days for the first, second and third lactations, respectively, did not increase milk production when converted to a 305-day lactation. Stochastic relations between calving interval and calving age and month are moderate; Pearson's correlation coefficients ranged from 0.38 to 0.69. Adjusting milk production to reduce the effect of calving interval on the total phenotypic variance of milk yield is valid for the first lactation only. Adjusting 305-day milk yield for the second and third lactations to reduce the effects of calving age and month at the same time eliminates the calving interval effect.
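
A hedged sketch of the kind of linear adjustment described: regress 305-day yield on calving interval and correct records to a reference interval. The synthetic data, coefficients and the 365-day reference are illustrative assumptions, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(2)
calving_interval = rng.uniform(340, 480, 200)        # days
milk_305d = 5000 + 6.0 * (calving_interval - 365) + rng.normal(0, 400, 200)

# least-squares fit: yield = b0 + b1 * interval
X = np.column_stack([np.ones_like(calving_interval), calving_interval])
b0, b1 = np.linalg.lstsq(X, milk_305d, rcond=None)[0]

# adjust each record to a common reference interval of 365 days
adjusted = milk_305d - b1 * (calving_interval - 365)
print(f"slope {b1:.2f} kg/day; correlation of adjusted yield with interval: "
      f"{np.corrcoef(adjusted, calving_interval)[0, 1]:.3f}")
```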

Keywords: milk yield, Holstein, non-genetic, calving

Procedia PDF Downloads 389
578 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable

Authors: Xinyuan Y. Song, Kai Kang

Abstract:

Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach that deals with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is used to perform statistical inference. An application of the proposed joint model to a study from the Alzheimer's Disease Neuroimaging Initiative is presented.
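
A minimal simulation of the model's structure, assuming a single latent trait with a linear random-coefficient trajectory driving an exponential-baseline proportional-hazards event time. The paper's dynamic factor analysis and MCMC inference are not reproduced; all loadings and parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
# random-coefficient trajectory of the latent variable: eta_i(t) = a_i + b_i * t
a = rng.normal(0.0, 1.0, n)
b = np.clip(rng.normal(0.1, 0.05, n), 0.01, None)   # kept positive to simplify inversion

# three observed indicators measure the latent variable at baseline (assumed loadings)
loadings = np.array([1.0, 0.8, 0.6])
indicators = a[:, None] * loadings + rng.normal(0, 0.3, (n, 3))

# proportional hazards h(t) = h0 * exp(g * eta_i(t)); event times by inverting the
# cumulative hazard H(t) = h0 * exp(g*a) * (exp(g*b*t) - 1) / (g*b)
h0, g = 0.05, 0.8
u = rng.uniform(size=n)
times = np.log(1.0 - g * b * np.log(u) / (h0 * np.exp(g * a))) / (g * b)

censor = rng.uniform(0, 40, n)                      # administrative censoring
observed, event = np.minimum(times, censor), (times <= censor).astype(int)
print(f"event rate: {event.mean():.2f}, median follow-up: {np.median(observed):.1f}")
```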

Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data

Procedia PDF Downloads 112
577 Smart Web Services in the Web of Things

Authors: Sekkal Nawel

Abstract:

The Web of Things (WoT), the integration of smart technologies from the Internet or networks into Web architectures or applications, is becoming more complex, larger, and more dynamic. The WoT brings together various elements such as sensors, devices, networks, protocols, data, functionalities, and architectures to perform services for stakeholders. These services operate in the context of the interaction between stakeholders and the WoT elements. Such context is becoming a key information source in which data are of various natures and uncertain, thus leading to complex situations. In this paper, we take an interest in the development of intelligent Web services. The key ingredients of this 'intelligent' notion are context diversity, the necessity of a semantic representation to manage complex situations, and the capacity to reason with uncertain data. In this perspective, we introduce a multi-layered architecture based on a generic intelligent Web service model dealing with various contexts, which proactively predicts future situations and reactively responds to real-time situations in order to support decision-making. For semantic context data representation, we use PR-OWL, a probabilistic ontology based on Multi-Entity Bayesian Networks (MEBN). PR-OWL is flexible enough to represent complex, dynamic, and uncertain contexts, the key requirements for the development of intelligent Web services. A case study using the proposed architecture for intelligent plant watering shows the role of proactive and reactive contextual reasoning in the WoT.

Keywords: smart web service, the web of things, context reasoning, proactive, reactive, multi-entity bayesian networks, PR-OWL

Procedia PDF Downloads 36
576 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models

Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton

Abstract:

Reduced-dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, and this can present a challenge in practice. In this talk, we propose an online estimation method for high-dimensional spatio-temporal data based upon DSTMs, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the continuous spatial process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of using the Laplace prior is to filter out wavelet coefficients with low contribution and thus achieve dimension reduction with significant computational savings. We then propose a hierarchical Bayesian state space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.
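
A Laplace prior on wavelet coefficients corresponds, at the MAP estimate under Gaussian noise, to soft thresholding. A hedged one-dimensional sketch with PyWavelets (the paper works with spatio-temporal fields and a particle filter; the signal, wavelet choice and threshold value here are illustrative assumptions):

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
x = np.linspace(0, 1, 512)
signal = np.sin(8 * np.pi * x) * np.exp(-3 * x)     # smooth spatial process
noisy = signal + rng.normal(0, 0.2, x.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
# MAP under an iid Laplace prior with scale b and noise variance sigma^2
# is soft thresholding at lambda = sigma^2 / b
lam = 0.15
shrunk = [coeffs[0]] + [pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(shrunk, "db4")

kept = sum(int(np.count_nonzero(c)) for c in shrunk[1:])
total = sum(c.size for c in coeffs[1:])
print(f"nonzero detail coefficients: {kept}/{total}")   # the parsimonious basis
```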

Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets

Procedia PDF Downloads 400
575 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)

Authors: Robert Jacobsen

Abstract:

Flood inundation maps (FIMs) are an essential tool for communicating flood threat scenarios to the public as well as for floodplain governance. With increasing demand for online raster FIMs, the FIM state of the practice (SOP) is rapidly advancing to meet the dual requirements of high resolution and high accuracy, i.e., high definition. Importantly, today's technology also enables the resolution of local (neighborhood-scale) bias errors that often occur in FIMs, even when SOP two-dimensional flood modeling is used. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios from available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying it to prepare an HD-FIM for the August 2016 flood in Louisiana, using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved by applying RASP™ to the two kernel rasters is evaluated.
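
A heavily simplified sketch of the general idea of adjusting a kernel inundation raster so its water surface matches a refined scenario profile. This is a generic profile-difference adjustment under an assumed 1-D geometry, not the published RASP™ procedure; all arrays and values are hypothetical:

```python
import numpy as np

# hypothetical 1-D valley transect: ground elevation and a kernel water-surface profile
station = np.linspace(0, 10_000, 500)                  # distance along river (m)
ground = 10 + 0.001 * station + np.sin(station / 800)  # DEM elevations (m)
kernel_wse = 14 + 0.001 * station                      # kernel water-surface elevation (m)

# refined scenario profile known at a few cross sections
xs_station = np.array([0, 2_500, 5_000, 7_500, 10_000])
xs_wse = np.array([14.6, 17.0, 19.3, 21.8, 24.1])

# adjust the kernel surface by the interpolated profile difference
diff_at_xs = xs_wse - np.interp(xs_station, station, kernel_wse)
scenario_wse = kernel_wse + np.interp(station, xs_station, diff_at_xs)

depth = np.clip(scenario_wse - ground, 0, None)        # inundation depth, dry cells at 0
print(f"inundated fraction of transect: {(depth > 0).mean():.2f}")
```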

Keywords: hydrology, mapping, high-definition, inundation

Procedia PDF Downloads 32
574 Data Driven Infrastructure Planning for Offshore Wind Farms

Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree

Abstract:

The calculations done at the beginning of the life of a wind farm are rarely reliable, which makes it important to study the failure and repair rates of wind turbines under various conditions. The miscalculation arises because current models make the simplifying assumption that the failure/repair rate remains constant over time, meaning that the reliability function is exponential in nature. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data, which are then converted into times-to-repair and times-to-failure time series. Several mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease the downtime of the turbine.
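
A hedged sketch of the distribution-fitting step: maximum likelihood fits of two-parameter Weibull and exponential models to times-to-failure, compared by log-likelihood. The data are synthetic, and the paper additionally uses Bayesian posterior-expectation estimates not shown here:

```python
import numpy as np
from scipy.stats import weibull_min, expon

rng = np.random.default_rng(5)
ttf = weibull_min.rvs(1.05, scale=900, size=300, random_state=rng)  # hours

# two-parameter Weibull (location fixed at 0) vs exponential, both by MLE
shape, _, scale = weibull_min.fit(ttf, floc=0)
_, exp_scale = expon.fit(ttf, floc=0)

ll_weibull = weibull_min.logpdf(ttf, shape, 0, scale).sum()
ll_expon = expon.logpdf(ttf, 0, exp_scale).sum()
print(f"Weibull shape={shape:.2f} scale={scale:.0f}  logL={ll_weibull:.1f}")
print(f"Exponential mean={exp_scale:.0f}            logL={ll_expon:.1f}")
# a Weibull shape near 1 collapses to the exponential, which is consistent
# with the near-identical fits reported in the abstract
```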

Keywords: reliability, bayesian parameter inference, maximum likelihood estimation, weibull function, SCADA data

Procedia PDF Downloads 31
573 A Literature Review on Banks’ Profitability and Risk Adjustment Decisions

Authors: Libena Cernohorska, Barbora Sutorova, Petr Teply

Abstract:

There are ongoing discussions about the impact of global regulatory efforts on banks. In this paper we present a literature review on the profitability-risk-capital relationship in banking. Research papers dealing with this topic can be divided into two groups: the first focuses on the capital-risk relationship and the second analyzes the capital-profitability relationship. The first group investigates whether the imposition of stricter capital requirements reduces banks' risk-taking incentives, based on a simultaneous equations model. This line of work pioneered the idea that changes in both capital and risk have endogenous and exogenous components; the results obtained indicate that changes in the capital level are positively related to changes in asset risk. The second group of the literature, concentrating solely on the relationship between the level of held capital and bank profitability, is limited. Nevertheless, there are many studies dealing with banks' profitability as such, in which bank capital is very often included as an explanatory variable. Based on the literature review of dozens of relevant papers in this study, empirical research on banks' profitability and risk adjustment decisions under the new Basel III banking rules can readily be undertaken.

Keywords: bank, Basel III, capital, decision making, profitability, risk, simultaneous equations model

Procedia PDF Downloads 457
572 Case-Based Reasoning for Modelling Random Variables in the Reliability Assessment of Existing Structures

Authors: Francesca Marsili

Abstract:

The reliability assessment of existing structures with probabilistic methods is becoming an increasingly important and frequent engineering task. However, probabilistic reliability methods are based on exhaustive knowledge of the stochastic modeling of the variables involved in the assessment; at the moment, standards for the modeling of variables are absent, representing an obstacle to the dissemination of probabilistic methods. The framework according to which probability distribution functions (PDFs) are established is Bayesian statistics, which uses Bayes' theorem: a prior PDF for the considered parameter is established based on information derived from the design stage and on qualitative judgments based on the engineer's past experience; the prior model is then updated with the results of investigations carried out on the considered structure, such as material testing and the determination of actions and structural properties. The application of Bayesian statistics raises two kinds of problems: 1. The results of the updating depend on the engineer's previous experience; 2. The updating of the prior PDF can be performed only if the structure has been tested and quantitative data that can be statistically manipulated have been collected; performing tests is always an expensive and time-consuming operation, and if the considered structure is an ancient building, destructive tests could compromise its cultural value and should therefore be avoided. In order to solve these problems, an interesting research path is to investigate Artificial Intelligence (AI) techniques that can be useful for automating the modeling of variables and for updating material parameters without performing destructive tests. Among them, one that merits particular attention in relation to the object of this study is Case-Based Reasoning (CBR). In this application, cases are represented by existing buildings where material tests have already been carried out and updated PDFs for the material mechanical parameters have been computed through a Bayesian analysis. Each case is then composed of a qualitative description of the material under assessment and the posterior PDFs that describe its material properties. The problem to be solved is the definition of PDFs for material parameters involved in the reliability assessment of the considered structure. A CBR system is a good candidate for automating the modeling of variables because: 1. Engineers already estimate material properties based on the experience collected during the assessment of similar structures, or based on similar cases collected in the literature or in databases; 2. Material tests carried out on structures can easily be collected from laboratory databases or from the literature; 3. The system will provide the user with a reliable probabilistic description of the variables involved in the assessment, which will also serve as a tool in support of the engineer's qualitative judgments. Automated modeling of variables can help spread the probabilistic reliability assessment of existing buildings in common engineering practice and target the best interventions and further tests on the structure; CBR is a technique which may help achieve this.
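
A hedged sketch of the case-retrieval step: nearest cases by a similarity over encoded qualitative material descriptors, returning a pooled posterior for a new structure's material strength. The feature encoding, similarity measure and precision-weighted pooling rule are illustrative assumptions, not the paper's system:

```python
import numpy as np

# case base: encoded qualitative descriptors (e.g. era, exposure, masonry type)
# and posterior (mean, std) of compressive strength from past Bayesian updates
case_features = np.array([[1, 0, 2], [1, 1, 2], [0, 0, 1], [2, 1, 0]], float)
case_posteriors = np.array([[4.2, 0.5], [3.9, 0.6], [5.1, 0.4], [2.8, 0.7]])  # MPa

def retrieve(query, k=2):
    """Return the k most similar cases and a precision-weighted pooled posterior."""
    d = np.linalg.norm(case_features - query, axis=1)
    idx = np.argsort(d)[:k]
    mu, sd = case_posteriors[idx, 0], case_posteriors[idx, 1]
    w = 1.0 / sd**2                       # precision weighting
    pooled_mu = (w * mu).sum() / w.sum()
    pooled_sd = (1.0 / w.sum()) ** 0.5
    return idx, pooled_mu, pooled_sd

idx, mu, sd = retrieve(np.array([1, 0, 2], float))
print(f"retrieved cases {idx}; prior for new assessment: N({mu:.2f}, {sd:.2f}^2)")
```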

Keywords: reliability assessment of existing buildings, Bayesian analysis, case-based reasoning, historical structures

Procedia PDF Downloads 313
571 Computational Identification of Signalling Pathways in Protein Interaction Networks

Authors: Angela U. Makolo, Temitayo A. Olagunju

Abstract:

The knowledge of signaling pathways is central to understanding the biological mechanisms of organisms, since it has been established that in eukaryotic organisms the number of signaling pathways determines the number of ways the organism can react to external stimuli. Signaling pathways are studied using protein interaction networks constructed from protein-protein interaction data obtained by high-throughput experimental procedures. However, these high-throughput methods are known to produce very high rates of false positive and false negative interactions. In order to construct a useful protein interaction network from this noisy data, computational methods are applied to validate the protein-protein interactions. In this study, a computational technique was designed to identify signaling pathways from a protein interaction network constructed using validated protein-protein interaction data. A weighted interaction graph of the organism Saccharomyces cerevisiae (baker's yeast) was constructed, using the proteins as nodes and the interactions between them as edges. The weights were obtained using a Bayesian probabilistic network to estimate the posterior probability of interaction between two proteins given gene expression measurements as biological evidence. Only interactions above a threshold were accepted for the network model. A pathway was formalized as a simple path in the interaction network from a starting protein to an ending protein of interest. We were able to identify some pathway segments, one of which is a segment of the pathway that signals the start of meiosis in S. cerevisiae.
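
A hedged sketch of the graph stage: keep only edges whose posterior interaction probability exceeds a threshold, then enumerate candidate pathways as simple paths between a start and an end protein. The posterior probabilities and the 0.7 threshold are illustrative assumptions (the protein names are from the well-known yeast mating pathway, used here only as an example):

```python
import networkx as nx

# (protein_a, protein_b, posterior probability of a true interaction)
scored_edges = [("STE2", "STE4", 0.93), ("STE4", "STE5", 0.88),
                ("STE5", "STE11", 0.81), ("STE11", "STE7", 0.77),
                ("STE7", "FUS3", 0.85), ("STE2", "FUS3", 0.40)]

G = nx.Graph()
G.add_weighted_edges_from((a, b, p) for a, b, p in scored_edges if p >= 0.7)

# candidate signaling pathways = simple paths from receptor to terminal kinase
for path in nx.all_simple_paths(G, source="STE2", target="FUS3", cutoff=6):
    score = 1.0
    for u, v in zip(path, path[1:]):
        score *= G[u][v]["weight"]       # path score = product of edge posteriors
    print(" -> ".join(path), f"(score {score:.2f})")
```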

Keywords: Bayesian networks, protein interaction networks, Saccharomyces cerevisiae, signalling pathways

Procedia PDF Downloads 508
570 Automatic Multi-Label Image Annotation System Guided by Firefly Algorithm and Bayesian Method

Authors: Saad M. Darwish, Mohamed A. El-Iskandarani, Guitar M. Shawkat

Abstract:

Nowadays, the amount of available multimedia data is continuously rising, and finding a required image is a challenging task for an ordinary user. Content-based image retrieval (CBIR) computes relevance based on the visual similarity of low-level image features such as color, texture, etc. However, there is a gap between low-level visual features and the semantic meanings required by applications. The typical method of bridging the semantic gap is automatic image annotation (AIA), which extracts semantic features using machine learning techniques. In this paper, a multi-label image annotation system guided by the Firefly algorithm and a Bayesian method is proposed. First, images are segmented using maximum intra-cluster variance and the Firefly algorithm, a swarm-based approach with high convergence speed and low computation cost that searches for optimal multiple thresholds. Feature extraction techniques based on color features and region properties are applied to obtain representative features. The images are then annotated using a translation model based on the Net Bayes system, which is efficient for multi-label learning with high precision and low complexity. Experiments are performed using the Corel database. The results show that the proposed system outperforms traditional ones for automatic image annotation and retrieval.

Keywords: feature extraction, feature selection, image annotation, classification

Procedia PDF Downloads 557
569 Automatic Adjustment of Thresholds via Closed-Loop Feedback Mechanism for Solder Paste Inspection

Authors: Chia-Chen Wei, Pack Hsieh, Jeffrey Chen

Abstract:

Surface Mount Technology (SMT) is widely used in electronic assembly, in which electronic components are mounted onto the surface of the printed circuit board (PCB). Most of the defects in the SMT process are related to the quality of solder paste printing, and these defects lead to considerable manufacturing costs in the electronics assembly industry. Therefore, the solder paste inspection (SPI) machine, which controls and monitors the amount of solder paste printed, has become an important part of the production process. So far, SPI thresholds have been set using statistical analysis and experts' experience to determine appropriate settings. Because the production data are not normally distributed and there are various sources of variation in the production processes, defects related to solder paste printing still occur. To solve this problem, this paper proposes an online machine learning algorithm, called the automatic threshold adjustment (ATA) algorithm, together with a closed-loop architecture in the SMT process to determine the best threshold settings. Simulation experiments show that the proposed threshold settings improve the accuracy from 99.85% to 100%.
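
A minimal sketch of a closed-loop threshold update, nudging the lower/upper solder-volume limits from labeled outcomes fed back by downstream test. The update rule, learning rate and synthetic data are illustrative assumptions, not the paper's ATA algorithm:

```python
import numpy as np

rng = np.random.default_rng(6)
volumes = rng.normal(100, 12, 2000)        # measured paste volume, % of nominal
defective = np.abs(volumes - 100) > 30     # ground truth fed back from downstream test

lower, upper, lr = 70.0, 130.0, 0.5        # initial window and learning rate
for v, bad in zip(volumes, defective):
    flagged = not (lower <= v <= upper)
    if flagged and not bad:                # false alarm: widen the violated limit
        if v < lower: lower -= lr
        else:         upper += lr
    elif not flagged and bad:              # escape: tighten the nearer limit
        if v < 100:   lower += lr
        else:         upper -= lr

print(f"adapted window: [{lower:.1f}, {upper:.1f}] % of nominal volume")
```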

Keywords: big data analytics, Industry 4.0, SPI threshold setting, surface mount technology

Procedia PDF Downloads 90
568 Supplier Risk Management: A Multivariate Statistical Modelling and Portfolio Optimization Based Approach for Supplier Delivery Performance Development

Authors: Jiahui Yang, John Quigley, Lesley Walls

Abstract:

In this paper, the authors develop a stochastic model of investment in supplier delivery performance development from a buyer's perspective. The authors propose a multivariate model based on a Multinomial-Dirichlet distribution within an empirical Bayesian inference framework, representing both the epistemic and aleatory uncertainties in deliveries. A closed-form solution is obtained, and lower and upper bounds for both the optimal investment level and the expected profit under uncertainty are derived. The theoretical properties provide decision makers with useful insights into supplier delivery performance improvement problems where multiple delivery statuses are involved. The authors also extend the model from a single-supplier investment to a supplier portfolio, using a Lagrangian method to obtain a theoretical expression for the optimal investment levels and overall expected profit. The model enables a buyer to know how the marginal expected profit/investment level of each supplier changes with respect to the budget and which supplier should be invested in when additional budget is available. An application of this model is illustrated in a simulation study. Overall, the main contribution of this study is to provide an optimal investment decision-making framework for supplier development, taking into account multiple delivery statuses as well as multiple projects.
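
A small sketch of the Multinomial-Dirichlet building block: the conjugate posterior over delivery statuses (early / on-time / late) after observed deliveries, and the resulting expected profit for an assumed per-status payoff. The payoffs, pseudo-counts and "development" shift are illustrative; the paper's closed-form investment bounds are not reproduced:

```python
import numpy as np

alpha = np.array([1.0, 4.0, 2.0])      # Dirichlet prior pseudo-counts: early, on-time, late
counts = np.array([3, 41, 16])         # observed deliveries per status
payoff = np.array([-2.0, 10.0, -8.0])  # assumed profit per delivery in each status

posterior = alpha + counts             # conjugate Multinomial-Dirichlet update
p_mean = posterior / posterior.sum()   # posterior mean status probabilities

print("posterior status probabilities:", p_mean.round(3))
print(f"expected profit per delivery: {p_mean @ payoff:.2f}")

# an investment that shifts pseudo-counts toward on-time is one simple way to
# model supplier development
developed = posterior + np.array([0.0, 10.0, 0.0])
print("after hypothetical development:", (developed / developed.sum()).round(3))
```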

Keywords: decision making, empirical bayesian, portfolio optimization, supplier development, supply chain management

Procedia PDF Downloads 266
567 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm, the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST), is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), whose computational cost grows more slowly with data size but which requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, the algorithm is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented and systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.

Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference

Procedia PDF Downloads 81
566 Novel Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we propose a novel inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can derive simultaneously both the posterior distribution of a latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and a variational EM framework, performed in two steps: the expectation and maximization steps. First, in the expectation step, using Bayes' rule and the LA technique, we derive approximately the posterior distribution of the latent function indicating the possibility that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix necessary to define the prior distribution for the latent function. These two steps repeat iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
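
For orientation, an off-the-shelf analogue: scikit-learn's Gaussian process classifier also combines a Laplace approximation of the latent-function posterior with marginal-likelihood optimization of the kernel hyper-parameters. The toy 2-D features below stand in for action descriptors; the paper's variational-EM algorithm differs in detail:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(7)
# toy 2-D features for three "action" classes
X = np.concatenate([rng.normal(m, 0.5, (40, 2)) for m in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 40)

# Laplace-approximated GP, one-vs-rest for the multi-class case;
# kernel hyper-parameters are optimized by maximizing the marginal likelihood
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(1.0), multi_class="one_vs_rest")
gpc.fit(X, y)

print("optimized kernel:", gpc.kernel_)
print("train accuracy:", gpc.score(X, y))
print("class probabilities for [1.5, 1.5]:", gpc.predict_proba([[1.5, 1.5]]).round(3))
```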

Keywords: bayesian rule, gaussian process classification model with multiclass, gaussian process prior, human action classification, laplace approximation, variational EM algorithm

Procedia PDF Downloads 304
565 Simulation of Glass Breakage Using Voronoi Random Field Tessellations

Authors: Michael A. Kraus, Navid Pourmoghaddam, Martin Botz, Jens Schneider, Geralt Siebert

Abstract:

Fragmentation analysis of tempered glass gives insight into the quality of the tempering process and also defines a certain degree of safety. Different standards, such as the European EN 12150-1 or the American ASTM C 1048/CPSC 16 CFR 1201, define a minimum number of fragments required for soda-lime safety glass on the basis of fragmentation test results for classification. This work presents an approach to glass breakage pattern prediction using a Voronoi tessellation over random fields. The random Voronoi tessellation is trained on and validated against data from several breakage patterns. The fragments in observation areas of 50 mm x 50 mm were used for training and validation. All glass specimens used in this study were commercially available soda-lime glasses at three thickness levels: 4 mm, 8 mm and 12 mm. The results of this work form a Bayesian framework for the training and prediction of breakage patterns of tempered soda-lime glass using a Voronoi random field tessellation. Uncertainties occurring in this process can be well quantified, and several statistical measures of the pattern can be preserved with this method. Within this work it was found that different random fields as the basis for the Voronoi tessellation lead to differently well-fitted statistical properties of the glass breakage patterns. As the methodology is derived and kept general, the framework could also be applied to other random tessellations and crack-pattern modeling purposes.
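
A minimal sketch of the geometric core: seed points drawn over a pane, a Voronoi tessellation as the fragment pattern, and a fragment count in a 50 mm x 50 mm observation window. A homogeneous Poisson point process stands in for the trained random field, and the intensity is an illustrative assumption:

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(8)
pane = (360.0, 360.0)                      # pane size in mm
intensity = 60 / (50.0 * 50.0)             # ~60 fragments per 50x50 mm window (assumed)

n_seeds = rng.poisson(intensity * pane[0] * pane[1])
seeds = rng.uniform([0.0, 0.0], pane, (n_seeds, 2))   # one seed per fragment
vor = Voronoi(seeds)                       # Voronoi ridges play the role of cracks

# fragment count inside a central 50 mm x 50 mm observation window
lo, hi = np.array([155.0, 155.0]), np.array([205.0, 205.0])
in_window = np.all((seeds >= lo) & (seeds <= hi), axis=1)
print(f"{in_window.sum()} fragments in the window; "
      f"{len(vor.ridge_points)} crack segments in the pane")
```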

Keywords: glass breakage prediction, Voronoi random field tessellation, fragmentation analysis, Bayesian parameter identification

Procedia PDF Downloads 131
564 Streamflow Modeling Using the PyTOPKAPI Model with Remotely Sensed Rainfall Data: A Case Study of Gilgel Ghibe Catchment, Ethiopia

Authors: Zeinu Ahmed Rabba, Derek D Stretch

Abstract:

Remote sensing contributes valuable information to streamflow estimates. Usually, streamflow is directly measured through ground-based hydrological monitoring stations. However, in many developing countries like Ethiopia, ground-based hydrological monitoring networks are either sparse or nonexistent, which limits the management of water resources and hampers early flood-warning systems. In such cases, satellite remote sensing is an alternative means of acquiring such information. This paper discusses the application of remotely sensed rainfall data for streamflow modeling in the Gilgel Ghibe basin in Ethiopia. Ten years (2001-2010) of two satellite-based precipitation products (SBPPs), TRMM and WaterBase, were used. These products were combined with the PyTOPKAPI hydrological model to generate daily streamflows. The results were compared with streamflow observations at the Gilgel Ghibe Nr. Assendabo gauging station using four statistical measures (bias, R², NS and RMSE). The statistical analysis indicates that the bias-adjusted SBPPs agree well with gauged rainfall compared to the bias-unadjusted ones. The SBPPs without bias adjustment tend to overestimate (high bias and high RMSE) extreme precipitation events and the corresponding simulated streamflow outputs, particularly during wet months (June-September), and to underestimate streamflow over a few dry months (January and February). This shows that bias adjustment can be important for improving the performance of SBPPs in streamflow forecasting. We further conclude that the general streamflow patterns were well captured at daily time scales when using SBPPs after bias adjustment. However, the overall results demonstrate that streamflow simulated using gauged rainfall is superior to that obtained from remotely sensed rainfall products, including bias-adjusted ones.
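
A hedged sketch of a simple multiplicative bias adjustment of satellite rainfall against a gauge, scored with the Nash-Sutcliffe efficiency (NS) used in such comparisons. The data are synthetic, and the paper's actual adjustment scheme and PyTOPKAPI run are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(9)
gauge = rng.gamma(0.6, 8.0, 365)                      # daily gauge rainfall (mm)
satellite = np.clip(1.35 * gauge + rng.normal(0, 2.0, 365), 0, None)  # biased product

# multiplicative bias factor from long-term totals, then adjustment
bias = satellite.sum() / gauge.sum()
adjusted = satellite / bias

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean of obs."""
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

print(f"bias factor: {bias:.2f}")
print(f"NSE before adjustment: {nse(gauge, satellite):.2f}, "
      f"after: {nse(gauge, adjusted):.2f}")
```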

Keywords: Ethiopia, PyTOPKAPI model, remote sensing, streamflow, Tropical Rainfall Measuring Mission (TRMM), waterBase

Procedia PDF Downloads 249
563 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials

Authors: Rajesh Kumar G

Abstract:

A challenging aspect of any clinical trial is to plan the study design carefully to meet the study objective in an optimal way and to validate the assumptions made during protocol design. When it is a pediatric study, there is the added challenge of stringent guidelines and difficulty in recruiting the necessary subjects. Unlike adult trials, little historical data is available for pediatrics, and such data is needed to validate assumptions when planning pediatric trials. Typically, pediatric studies are initiated as soon as approval is obtained for a drug to be marketed for adults, so with the historical information from the adult study and with available pediatric pilot study data or simulated pediatric data, the pediatric study can be well planned. Generalizing a historical adult study to a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and utilizing the advantages of a hybrid study design, which helps achieve the study objective smoothly even in the presence of many constraints. This paper explains how a hybrid study design can be planned along with an integrated technique (SEV) for planning the pediatric study. In brief, the SEV technique (Simulation, Estimation (using borrowed adult data and applying Bayesian methods) and Validation) incorporates simulating the planned study data and obtaining the desired estimates to validate the assumptions. This method of validation can improve the accuracy of data analysis, ensuring that results are as valid and reliable as possible, and allows informed decisions to be made well ahead of study initiation. Based on the collected data, this technique allows insight into best practices when using data from a historical study and simulated data alike.
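
A minimal sketch of Bayesian borrowing for the estimation step: a power prior that down-weights adult data when forming the posterior for a simulated pediatric mean effect. The discount weight, effect sizes and normal-normal conjugacy are illustrative assumptions, not the paper's SEV specification:

```python
import numpy as np

rng = np.random.default_rng(10)
sigma = 1.0                      # known outcome SD (for conjugate simplicity)

# historical adult data and a small simulated pediatric trial
adult = rng.normal(0.50, sigma, 400)
pediatric = rng.normal(0.35, sigma, 30)

a0 = 0.3                         # power-prior discount on adult information
# normal-normal conjugate posterior with the adult likelihood raised to a0
prec = (a0 * adult.size + pediatric.size) / sigma**2
mean = (a0 * adult.size * adult.mean()
        + pediatric.size * pediatric.mean()) / sigma**2 / prec
sd = prec ** -0.5

print(f"posterior mean effect: {mean:.3f} (sd {sd:.3f})")
# repeating this over many simulated pediatric datasets gives the operating
# characteristics used to validate design assumptions before study start
```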

Keywords: adaptive design, simulation, borrowing data, bayesian model

Procedia PDF Downloads 36
562 Uncertainty Assessment in Building Energy Performance

Authors: Fally Titikpina, Abderafi Charki, Antoine Caucheteux, David Bigaud

Abstract:

The building sector is one of the largest energy consumers, accounting for about 40% of final energy consumption in the European Union. Ensuring building energy performance is a scientific, technological and sociological matter. To assess a building's energy performance, the consumption predicted or estimated during the design stage is compared with the measured consumption when the building is operational. When evaluating this performance, many buildings show significant differences between calculated and measured consumption. In order to assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved, not only in measurement but also those induced by the propagation of dynamic and static input data through the model being used. The evaluation of measurement uncertainty is based both on knowledge of the measurement process and on the input quantities which influence the measurement result. Measurement uncertainty can be evaluated within the framework of conventional statistics presented in the Guide to the Expression of Uncertainty in Measurement (GUM), as well as by Bayesian Statistical Theory (BST). Another choice is the use of numerical methods like Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for estimating the energy consumption of a given building. A detailed review and discussion of these three approaches (GUM, MCS and BST) is given. An office building has been monitored, and multiple sensors have been mounted at candidate locations to obtain the required data. The monitored zone is composed of six offices and has an overall surface of 102 m². Temperature data, electrical and heating consumption, window opening and occupancy rate are the features of our research work.
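
A small sketch of the MCS route: propagating assumed input uncertainties through a simplified steady-state heating model to obtain a consumption distribution and a coverage interval. The model form, parameter values and distributions are illustrative assumptions, not the paper's building model:

```python
import numpy as np

rng = np.random.default_rng(11)
N = 100_000                                 # Monte Carlo draws

# simplified model: Q = U * A * HDD * 24 / 1000  (kWh), with uncertain inputs
U = rng.normal(1.1, 0.10, N)                # mean U-value, W/(m^2 K)
A = rng.normal(102.0, 1.0, N)               # envelope-related area, m^2
HDD = rng.normal(2400, 150, N)              # heating degree-days, K*day

Q = U * A * HDD * 24 / 1000                 # annual heating energy, kWh

mean, std = Q.mean(), Q.std(ddof=1)
lo, hi = np.percentile(Q, [2.5, 97.5])
print(f"predicted consumption: {mean:.0f} kWh, standard uncertainty {std:.0f} kWh")
print(f"95% coverage interval: [{lo:.0f}, {hi:.0f}] kWh")
```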

Keywords: building energy performance, uncertainty evaluation, GUM, bayesian approach, monte carlo method

Procedia PDF Downloads 430
561 Cost Benefit Analysis and Adjustments of Corporate Social Responsibility in the Airline Industry

Authors: Roman Asatryan

Abstract:

The decision-making processes regarding Corporate Social Responsibility (CSR) among firms in general, and airlines in particular, have to do with the benefits that accrue from those investments. The crux of the matter is not whether to invest in CSR, but rather how firms can quantify the benefits derived from such investments. This paper analyzes cost-benefit adjustment strategies for firms in the airline industry in adopting and implementing their CSR strategies. The adjustment strategies identified will give firms in the airline industry a basis for determining the worth of such CSR investments. This paper discusses the cost-benefit analysis (CBA) model in order to understand the ways airlines can reduce costs and increase returns on CSR, or balance the costs and benefits. The analysis in this study points to the fact that economic concepts, especially CBA, are useful, though not without challenges. The challenge arises when it is problematic to express the real impact of an externality in monetary terms. The pursuit of rational maximization of gains may seem a rather optimistic goal, mainly because of environmental variability, perceptual uncertainty, and imperfect knowledge about the potential externality. This paper concludes that the CBA model gives a basic understanding of the motivations for investing in intangible assets like CSR. Consequently, it sets the tone for formulating relevant hypotheses in empirical studies of investment in CSR in particular and in other intangible assets in business operations.

Keywords: cost-benefit analysis, corporate social responsibility, airline industry

Procedia PDF Downloads 358
560 Bayesian Inference of Physicochemical Quality Elements of Tropical Lagoon Nokoué (Benin)

Authors: Hounyèmè Romuald, Maxime Logez, Mama Daouda, Argillier Christine

Abstract:

In view of the very strong degradation of aquatic ecosystems, it is urgent to set up monitoring systems that best report the effects of the stresses these ecosystems undergo. This is particularly true in developing countries, where specific and relevant quality standards and funding for monitoring programs are lacking. The objective of this study was to make a relevant and objective choice of physico-chemical parameters informative of the main stressors occurring in African lakes and to identify their alteration thresholds. Based on statistical analyses of the relationship between several driving forces and the physico-chemical parameters of the Nokoué lagoon, relevant physico-chemical parameters were selected for its monitoring. An innovative method based on Bayesian statistical modeling was used. Eleven physico-chemical parameters were selected for their response to at least one stressor, and their threshold quality standards were also established: total phosphorus (<4.5 mg/L), orthophosphates (<0.2 mg/L), nitrates (<0.5 mg/L), TKN (<1.85 mg/L), dry organic matter (<5 mg/L), dissolved oxygen (>4 mg/L), BOD (<11.6 mg/L), salinity (7.6), water temperature (<28.7 °C), pH (>6.2), and transparency (>0.9 m). According to the System for the Evaluation of Coastal Water Quality, these thresholds correspond to 'good to medium' suitability classes, except for total phosphorus. One of the original features of this study is the use of the bounds of the credibility interval of the fixed-effect coefficients as local alteration standards for characterizing the physico-chemical status of this anthropized African ecosystem.

Keywords: driving forces, alteration thresholds, acadjas, monitoring, modeling, human activities

Procedia PDF Downloads 61
559 Modeling Spatio-Temporal Variation in Rainfall Using a Hierarchical Bayesian Regression Model

Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Gundula Bartzke, Hans-Peter Piepho

Abstract:

Rainfall is a critical component of climate, governing vegetation growth and production and forage availability and quality for herbivores. However, reliable rainfall measurements are not always available, making it necessary to predict rainfall values for particular locations through time. Predicting rainfall in space and time can be a complex and challenging task, especially where the rain gauge network is sparse and measurements are not recorded consistently for all rain gauges, leading to many missing values. Here, we develop a flexible Bayesian model for predicting rainfall in space and time and apply it to Narok County, situated in southwestern Kenya, using data collected at 23 rain gauges from 1965 to 2015. Narok County encompasses the Maasai Mara ecosystem, the northernmost section of the Mara-Serengeti ecosystem, famous for its diverse and abundant large mammal populations and the spectacular migration of enormous herds of wildebeest, zebra and Thomson's gazelle. The model incorporates geographical and meteorological predictor variables, including elevation, distance to Lake Victoria and minimum temperature. We assess the efficiency of the model by comparing it empirically with the established Gaussian process, kriging, simple linear and Bayesian linear models. We use the model to predict total monthly rainfall and its standard error for all 5 x 5 km grid cells in Narok County. Using the Monte Carlo integration method, we estimate seasonal and annual rainfall and their standard errors for 29 sub-regions in Narok. Finally, we use the predicted rainfall to predict large herbivore biomass in the Maasai Mara ecosystem on a 5 x 5 km grid for both the wet and dry seasons, and we show that herbivore biomass increases with rainfall in both seasons. The model can handle data from a sparse network of observations with many missing values and performs at least as well as, or better than, four established and widely used models on the Narok data set. The model produces rainfall predictions consistent with expectation and in good agreement with blended station and satellite rainfall values. The predictions are precise enough for most practical purposes. The model is very general and applicable to other variables besides rainfall.
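
For scale, a sketch of the simplest of the benchmark models mentioned: a Gaussian process regression of monthly rainfall on location and elevation, returning a prediction with a standard error for a grid cell. The gauges are synthetic, and the paper's hierarchical Bayesian model, missing-data handling and full predictor set go well beyond this:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(12)
# synthetic gauges: (x km, y km, elevation km) -> monthly rainfall (mm)
X = np.column_stack([rng.uniform(0, 100, 23), rng.uniform(0, 100, 23),
                     rng.uniform(1.0, 2.6, 23)])
y = 40 + 30 * X[:, 2] + 0.2 * X[:, 1] + rng.normal(0, 5, 23)

gp = GaussianProcessRegressor(kernel=RBF([50, 50, 1.0]) + WhiteKernel(1.0),
                              normalize_y=True).fit(X, y)

# predict rainfall with a standard error at one 5 x 5 km grid-cell centre
mu, sd = gp.predict(np.array([[52.5, 47.5, 1.8]]), return_std=True)
print(f"predicted monthly rainfall: {mu[0]:.1f} mm (se {sd[0]:.1f} mm)")
```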

Keywords: non-stationary covariance function, gaussian process, ungulate biomass, MCMC, maasai mara ecosystem

Procedia PDF Downloads 257
558 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification, but also reduces the feature space dimensions of pattern samples. In this method, each gray-scale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated; this ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved; FDA defines inter-class and intra-class scatter matrices to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match the test image with the training set, a cosine-similarity-based Bayesian classification is used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
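
A compact scikit-learn analogue of the pipeline's skeleton: PCA features, an LDA (Fisher) projection, then nearest-class-centroid matching by cosine similarity. The paper's eigen-filter construction and Bayesian formulation are richer; the data here are random stand-ins for face images:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(13)
# stand-in data: 120 "images" of 32x32 gray pixels, 6 expression classes
X = rng.normal(0, 1, (120, 32 * 32))
y = np.repeat(np.arange(6), 20)
X += 2.0 * rng.normal(0, 1, (6, 32 * 32))[y]       # add class-specific structure

Z = PCA(n_components=40).fit_transform(X)          # spatial feature compression
F = LinearDiscriminantAnalysis(n_components=5).fit(Z, y).transform(Z)  # Fisher subspace

centroids = np.array([F[y == c].mean(axis=0) for c in range(6)])

def classify(f):
    """Assign the class whose centroid has the highest cosine similarity."""
    sims = (centroids @ f) / (np.linalg.norm(centroids, axis=1) * np.linalg.norm(f))
    return int(np.argmax(sims))

acc = np.mean([classify(f) == c for f, c in zip(F, y)])
print(f"training accuracy of the cosine matcher: {acc:.2f}")
```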

Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, f-measure

Procedia PDF Downloads 401
557 University Climate and Psychological Adjustment: African American Women’s Experiences at Predominantly White Institutions in the United States

Authors: Faheemah N. Mustafaa, Tamarie Macon, Tabbye Chavous

Abstract:

A major concern of university leaders worldwide is how to create environments where students from diverse racial/ethnic, national, and cultural backgrounds can thrive. Over the past decade or so in the United States, African American women have done exceedingly well in terms of college enrollment, academic performance, and completion. However, the relative academic successes of African American women in higher education has in some ways overshadowed social challenges many Black women continue to encounter on college campuses in the United States. Within predominantly White institutions (PWIs) in particular, there is consistent evidence that many Black students experience racially hostile climates. However, research studies on racial climates within PWIs have mostly focused on cross-sectional comparisons of minority and majority group experiences, and few studies have examined campus racial climate in relation to short- and longer-term well-being. One longitudinal study reported that African American women’s psychological well-being was positively related to their comfort in cross-racial interactions (a concept closely related to campus climate). Thus, our primary research question was: Do African American women’s perceptions of campus climate (tension and positive association) during their freshman year predict their reports of psychological distress and well-being (self-acceptance) during their sophomore year? Participants were part of a longitudinal survey examining African American college students’ academic identity development, particularly in Science, Technology, Engineering, and Mathematics (STEM) fields. The final subsample included 134 self-identified African American/Black women enrolled in PWIs. Accounting for background characteristics (mother’s education, family income, interracial contact, and prior levels of outcomes), we employed hierarchical regression to examine relationships between campus racial climate during freshman year and psychological adjustment one year later. Both regression models significantly predicted African American women’s psychological outcomes (for distress, F(7,91)= 4.34, p < .001; and for self-acceptance, F(7,90)= 4.92, p < .001). Although none of the controls were significant predictors, perceptions of racial tension on campus were associated with both distress and self-acceptance. More perceptions of tension were related to African American women’s greater psychological distress the following year (B= 0.22, p= .01). Additionally, racial tension predicted later self-acceptance in the expected direction: Higher first-year reports of racial tension were related to less positive attitudes toward the self during the sophomore year (B= -0.16, p= .04). However, perceptions that it was normative for Black and White students to socialize on campus (or positive association scores) were unrelated to psychological distress or self-acceptance. Findings highlight the relevance of examining multiple facets of campus racial climate in relation to psychological adjustment, with possible emphasis on the import of racial tension on African American women’s psychological adjustment. Results suggest that negative dimensions of campus racial climate may have lingering effects on psychological well-being, over and above more positive aspects of climate. Thus, programs targeted toward improving student relations on campus should consider addressing cross-racial tensions.

Keywords: higher education, psychological adjustment, university climate, university students

Procedia PDF Downloads 359
556 Hope as a Predictor for Complicated Grief and Anxiety: A Bayesian Structural Equation Modeling Study

Authors: Bo Yan, Amy Y. M. Chow

Abstract:

Bereavement is recognized as a universally challenging experience, so it is important to gather research evidence on protective factors in bereavement. Hope is considered one such protective factor in previous coping studies. The present study investigates whether hope at the first month after the death predicts psychological symptoms, namely complicated grief (CG), anxiety, and depressive symptoms, at the seventh month. The data were collected via one-on-one interview surveys in a longitudinal project with Hong Kong hospice users (sample size 105). Most participants were middle-aged (49 years old on average), female (72%), and without religious affiliation (58%). A Bayesian Structural Equation Modeling (BSEM) analysis was conducted on the longitudinal dataset. The BSEM findings show that hope at the first month of bereavement negatively predicts both CG and anxiety symptoms at the seventh month, but not depressive symptoms. Age and gender are controlled for in the model, and the overall model fit is good. The current findings suggest assessing hope at the first month of bereavement: hope at the first month after the loss is an excellent predictor of complicated grief and anxiety symptoms at the seventh month. The clear result from this sample encourages cross-cultural replication of the model and the development of further clinical applications. In particular, early interventions to increase the level of hope have the potential to reduce psychological symptoms and thus improve bereaved persons' wellbeing in the long run.

Keywords: anxiety, complicated grief, depressive symptoms, hope, structural equation modeling

Procedia PDF Downloads 166
555 Oryzanol Recovery from Rice Bran Oil: Adsorption Equilibrium Models through a Kinetic Data Approach

Authors: A.D. Susanti, W. B. Sediawan, S.K. Wirawan, Budhijanto, Ritmaleni

Abstract:

Oryzanol, naturally present in rice bran oil (RBO), has high antioxidant activity, and its reviewed health properties make it of high interest in pharmacy, cosmetics, and nutrition. Because of the low concentration of oryzanol in crude RBO (0.9-2.9%), further processing, such as adsorption, is needed for practical usage. In this study, adsorption equilibrium models were investigated and adjusted through a kinetic data approach. A mathematical model of the kinetics of batch adsorption for separating oryzanol from RBO was set up and then applied to the equilibrium results. Since the adsorbent particles used in this case are relatively small, the concentration within the adsorbent is assumed to be uniform; hence, the adsorption rate is controlled by the rate of oryzanol mass transfer from the bulk RBO fluid to the surface of the silica gel. In this approach, the rate of mass transfer is assumed to be proportional to the deviation of the concentration from its equilibrium state. The equilibrium models applied were the Langmuir, distribution-coefficient, and Freundlich models, with the values of the parameters obtained from equilibrium results. It turned out that the models can quantitatively describe the experimental kinetics data, and adjusting the values of the equilibrium isotherm parameters significantly improves the accuracy of the model. The value of the mass transfer coefficient per unit adsorbent mass (kca) is then obtained by curve fitting.
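
A hedged sketch of the isotherm-fitting step with SciPy: least-squares fits of Langmuir and Freundlich forms to equilibrium data. The data points are synthetic, and the parameter symbols follow the usual isotherm notation, not necessarily the paper's:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, K):
    """q = q_max * K * c / (1 + K * c)"""
    return q_max * K * c / (1.0 + K * c)

def freundlich(c, K_f, n):
    """q = K_f * c**(1/n)"""
    return K_f * c ** (1.0 / n)

# synthetic equilibrium data: bulk concentration (mg/mL) vs adsorbed load (mg/g)
c_eq = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 2.5, 2.9])
q_eq = np.array([5.1, 10.8, 17.2, 21.0, 23.3, 24.9, 25.8])

pL, _ = curve_fit(langmuir, c_eq, q_eq, p0=[30, 1])
pF, _ = curve_fit(freundlich, c_eq, q_eq, p0=[10, 2])
for name, model, p in [("Langmuir", langmuir, pL), ("Freundlich", freundlich, pF)]:
    sse = np.sum((q_eq - model(c_eq, *p)) ** 2)
    print(f"{name}: parameters {np.round(p, 2)}, SSE {sse:.2f}")
```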

Keywords: adsorption equilibrium, adsorption kinetics, oryzanol, rice bran oil

Procedia PDF Downloads 295
554 From Responses of Macroinvertebrate Metrics to the Definition of Reference Thresholds

Authors: Hounyèmè Romuald, Mama Daouda, Argillier Christine

Abstract:

The present study focused on the use of benthic macrofauna to define the reference state of an anthropized lagoon (Nokoué, Benin) from the responses of relevant metrics to pressure proxies. The approach used is a combination of a joint species distribution model and Bayesian networks. The joint species distribution model was used to select the relevant metrics and to generate posterior probabilities, which were then converted into posterior response probabilities for each of the quality classes (pressure levels); these constitute the conditional probability tables of the probabilistic graph representing the causal relationships between metrics and pressure proxies. For the definition of reference thresholds, the predicted responses for low pressure levels were read from probability density diagrams. Observations collected during high- and low-water periods spanning three consecutive years (2004-2006), sampling 33 macroinvertebrate taxa present in all seasons and at all sampling points, together with measurements of 14 environmental parameters, were used as application data. The study demonstrated reliable inferences, the selection of seven relevant metrics, and the definition of quality thresholds for each environmental parameter. The relevance of the metrics and of the reference thresholds for ecological assessment, despite the small sample size, suggests a potential for wider applicability of the approach in aquatic ecosystem monitoring and assessment programs in developing countries, which are generally characterized by a lack of monitoring data.

Keywords: pressure proxies, bayesian inference, bioindicators, acadjas, functional traits

Procedia PDF Downloads 55