Search results for: Bayesian information criterion
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11199

10989 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact

Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed

Abstract:

Bayesian Network (BN) is one of the most efficient classification methods. It is widely used in several fields (e.g., medical diagnosis, risk analysis, bioinformatics research). A BN is a probabilistic graphical model that provides a formalism for reasoning under uncertainty, and this classification method achieves a high performance rate in extracting new knowledge from data. Constructing the model consists of two phases: structure learning and parameter learning. For the structure learning problem, the K2 algorithm is a representative data-driven algorithm based on a score-and-search approach. In addition, integrating expert knowledge into the structure learning process yields higher accuracy. In this paper, we propose a hybrid approach that combines an improvement of the K2 algorithm, called K2 algorithm for Parents and Children search (K2PC), with an expert-driven method for learning the structure of the BN. Evaluation of the experimental results on well-known benchmarks shows that our K2PC algorithm performs better in terms of correct structure detection. A real application of our model demonstrates its efficiency in analyzing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
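The score-and-search idea behind K2 can be illustrated with a small greedy parent search scored by BIC (a simplification: the actual K2 algorithm uses the Bayesian K2 score and a fixed node ordering, and K2PC itself is not reproduced here; the variable names and data below are synthetic):

```python
import math
import random
from collections import Counter

def bic_score(data, child, parents):
    # BIC = maximized log-likelihood of child given parents, minus a
    # complexity penalty of 0.5 * (#free parameters) * log(n).
    n = len(data)
    joint = Counter((tuple(r[p] for p in parents), r[child]) for r in data)
    marginal = Counter(tuple(r[p] for p in parents) for r in data)
    ll = sum(c * math.log(c / marginal[pa]) for (pa, _), c in joint.items())
    child_states = len({r[child] for r in data})
    n_params = (child_states - 1) * max(1, len(marginal))
    return ll - 0.5 * n_params * math.log(n)

def greedy_parents(data, child, candidates, max_parents=2):
    # K2-style greedy search: repeatedly add the candidate parent that
    # most improves the score; stop when no addition helps.
    parents, best = [], bic_score(data, child, [])
    while len(parents) < max_parents:
        gains = [(bic_score(data, child, parents + [c]), c)
                 for c in candidates if c not in parents]
        if not gains:
            break
        score, cand = max(gains)
        if score <= best:
            break
        parents.append(cand)
        best = score
    return parents

# Synthetic data: B copies A 90% of the time; C is independent noise.
random.seed(0)
data = []
for _ in range(500):
    a = random.random() < 0.5
    b = a if random.random() < 0.9 else not a
    c = random.random() < 0.5
    data.append({"A": int(a), "B": int(b), "C": int(c)})
parents = greedy_parents(data, "B", ["A", "C"])
```

On this data the search recovers A as a parent of B; the BIC penalty discourages adding the independent variable C.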

Keywords: Bayesian network, classification, expert knowledge, structure learning, surface water analysis

Procedia PDF Downloads 93
10988 A Criterion to Minimize FE Mesh-Dependency in Concrete Plates Subjected to Impact Loading

Authors: Kwak, Hyo-Gyung, Gang, Han Gul

Abstract:

In the context of an increasing need for reliability and safety in concrete structures under blast and impact loading, the behavior of concrete under high strain rates has become an important issue. Since concrete subjected to impact loading at high strain rates shows quite different material behavior from that in the static state, several material models have been proposed and used to describe high-strain-rate behavior under blast and impact loading. In the modelling process, mesh dependency of the finite element (FE) discretization is the key problem, because simulation results under high-strain-rate conditions are quite sensitive to the applied FE mesh size; the accuracy of simulation results may therefore depend strongly on the mesh size used. This paper introduces an improved criterion that can minimize the mesh-dependency of simulation results on the basis of the fracture energy concept, and the HJC (Holmquist Johnson Cook), CSC (Continuous Surface Cap), and K&C (Karagozian & Case) models are examined to trace their relative sensitivity to the FE mesh size used. To match the penetration test of a concrete plate struck by a projectile (bullet), the residual velocities of the projectile after penetration are compared. The correlation studies between analytical results and the associated parametric studies show that the variation of residual velocity with FE mesh size is greatly reduced by applying a unique failure strain value determined according to the proposed criterion.
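The fracture-energy idea can be sketched with a crack-band-style adjustment of failure strain to element size, assuming a linear softening law (a hedged illustration only; the paper's actual criterion and the HJC/CSC/K&C model details are not reproduced, and G_f, f_t, h stand for the usual fracture energy, tensile strength, and element size):

```python
def failure_strain(G_f, f_t, h):
    """Mesh-adjusted failure strain under an assumed linear softening law.

    Energy dissipated per unit crack area in one element-wide band:
        g = 0.5 * f_t * eps_f * h   (triangle under the softening branch)
    Setting g equal to the material fracture energy G_f gives eps_f.
    """
    return 2.0 * G_f / (f_t * h)

def dissipated_energy(G_f, f_t, h):
    # By construction, the energy dissipated per unit crack area is
    # independent of the element size h.
    eps_f = failure_strain(G_f, f_t, h)
    return 0.5 * f_t * eps_f * h
```

Halving the element size doubles the failure strain, so the dissipated energy per unit crack area stays at G_f regardless of the mesh, which is the mechanism that suppresses mesh-dependency.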

Keywords: high strain rate concrete, penetration simulation, failure strain, mesh-dependency, fracture energy

Procedia PDF Downloads 497
10987 Method of Synthesis of Controlled Generators of Balanced Strict Avalanche Criterion Functions

Authors: Ali Khwaldeh, Nimer Adwan

Abstract:

In this paper, a method for constructing a controlled balanced Boolean function satisfying the Strict Avalanche Criterion (SAC) is proposed. The proposed method is based on the use of three orthogonal nonlinear components, unlike high-order SAC functions. The generator synthesized by the proposed method therefore has separate sets of control and information inputs. The method is simple and practical to implement, and it allows synthesizing a SAC function generator with fixed control and information inputs. This ensures greater efficiency of the resulting generator compared to high-order SAC functions used as generators. Accordingly, the method is completely formalized and implemented as a software product.
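The two properties named in the title, balancedness and the SAC, are easy to state in code. A minimal checker over all 2^n inputs (illustrative only; the paper's three-orthogonal-component construction is not reproduced):

```python
from itertools import product

def is_balanced(f, n):
    # Balanced: the truth table contains equally many 0s and 1s.
    vals = [f(x) for x in product((0, 1), repeat=n)]
    return vals.count(0) == vals.count(1)

def satisfies_sac(f, n):
    # Strict Avalanche Criterion: flipping any single input bit must flip
    # the output for exactly half of all 2^n inputs.
    for i in range(n):
        flips = 0
        for x in product((0, 1), repeat=n):
            y = list(x)
            y[i] ^= 1
            flips += f(x) != f(tuple(y))
        if flips != 2 ** (n - 1):
            return False
    return True
```

For example, AND on two bits satisfies the SAC but is not balanced, while XOR-ing in a third bit restores balance at the cost of the SAC, which is why constructions that achieve both at once are non-trivial.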

Keywords: Boolean function, controlled balanced Boolean function, strict avalanche criterion, orthogonal nonlinear components

Procedia PDF Downloads 130
10986 Estimating the Potential of Solar Energy: A Moroccan Case Study

Authors: Fakhreddin El Wali Elalaoui, Maatouk Mustapha

Abstract:

The problem of global climate change is becoming more and more serious, so there is growing interest in renewable energy sources to minimize the impact of this phenomenon. Environmental policies are changing in different countries, including Morocco, with a greater focus on the integration and development of renewable energy projects. The purpose of this paper is to evaluate the potential of solar power plants in Morocco based on two technologies: concentrated solar power (CSP) and photovoltaics (PV). To perform an accurate assessment, suitable selection criteria must be chosen; four were retained: climate, topography, location, and water resources. The Analytic Hierarchy Process (AHP) was used to calculate the weight/importance of each criterion. The resulting weights are then applied to the map of each criterion to produce a final ranking of regions according to their potential. The results show that Morocco has strong potential for both technologies, especially in the southern region. Finally, this work is the first in the field to include the whole of Morocco in the study area.
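The weighted-overlay step, once AHP weights are in hand, can be sketched as follows (the criterion scores, their 0 to 1 scale, and the region names are all hypothetical; in the study the weights are applied cell by cell to GIS criterion maps rather than to region summaries):

```python
def suitability(regions, weights):
    """Rank regions by the weighted sum of their normalized criterion scores.

    regions: {name: {criterion: score in [0, 1]}}
    weights: {criterion: AHP weight}, summing to 1
    """
    totals = {name: sum(weights[c] * s for c, s in scores.items())
              for name, scores in regions.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative weights and per-region criterion scores:
weights = {"climate": 0.5, "topography": 0.2, "location": 0.2, "water": 0.1}
regions = {
    "south":  {"climate": 0.9, "topography": 0.7, "location": 0.6, "water": 0.4},
    "north":  {"climate": 0.6, "topography": 0.8, "location": 0.7, "water": 0.7},
    "center": {"climate": 0.7, "topography": 0.6, "location": 0.8, "water": 0.5},
}
ranking = suitability(regions, weights)
```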

Keywords: PV, CSP, solar energy, GIS

Procedia PDF Downloads 59
10985 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion whose limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverage close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance holds as in the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is accounting for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, although the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
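The GIC family with AIC and BIC as special cases can be made concrete with a small scoring helper; this sketches only the ranking step, not the paper's uncertainty band or the multivariate Gaussian quantile computation (the model names, log-likelihoods, and parameter counts below are illustrative):

```python
import math

def gic(loglik, n_params, penalty):
    """Generalized information criterion: -2*loglik + penalty * n_params.
    penalty = 2           -> AIC
    penalty = log(n_obs)  -> BIC
    """
    return -2.0 * loglik + penalty * n_params

def rank_models(models, penalty):
    # models: {name: (loglik, n_params)}; smaller GIC is better.
    scored = {m: gic(ll, k, penalty) for m, (ll, k) in models.items()}
    return sorted(scored, key=scored.get)

# Two hypothetical candidates fitted to n = 50 observations; AIC and BIC
# disagree on which is best, which is exactly the ambiguity the paper's
# uncertainty band is designed to quantify.
models = {"A": (-100.0, 2), "B": (-97.5, 4)}
aic_order = rank_models(models, 2.0)
bic_order = rank_models(models, math.log(50))
```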

Keywords: model selection inference, generalized information criteria, post-model selection inference, asymptotic theory

Procedia PDF Downloads 61
10984 A Discussion of the Composition of Feng Shui from the Environmental Planning Viewpoint

Authors: Jhuang Jin-Jhong, Hsieh Wei-Fan

Abstract:

Climate change causes natural disasters persistently. Nowadays, environmental planning therefore tends toward respecting nature and coexisting with it, and as a result natural environment analysis (e.g., of topography, soil, hydrology, climate, and vegetation) is highly emphasized. On the other hand, Feng Shui has been a criterion for residential site selection in the East since ancient times and has further influenced site selection for castles and even temples and tombs. Its primary criterion is judging the quality of Long (mountain range), Sha (nearby mountains), Shui (hydrology), Xue (foundation), and Xiang (aspect), which resemble the environmental variables of mountain range, topography, hydrology, and aspect. For this reason, many researchers have attempted to probe the connection between Feng Shui criteria and environmental planning factors. Most studies, however, discuss only the composition and spatial theory of Feng Shui; none has explained Feng Shui through the environmental field. Consequently, this study reviews the theory of Feng Shui from the environmental planning viewpoint and assembles the essential composition factors of Feng Shui. From the literature review and a comparison of theoretical meanings, we find that the ideal principles for planning the Feng Shui environment can also be used for environmental planning. This article therefore uses 12 ideal environmental features of Feng Shui to contrast the natural aspects of the environment, makes comparisons with previous research, and classifies the environmental factors into climate, topography, hydrology, vegetation, and soil.

Keywords: the composition of Feng Shui, environmental planning, site selection, main components of the Feng Shui environment

Procedia PDF Downloads 483
10983 Metacognitive Processing in Early Readers: The Role of Metacognition in Monitoring Linguistic and Non-Linguistic Performance and Regulating Students' Learning

Authors: Ioanna Taouki, Marie Lallier, David Soto

Abstract:

Metacognition refers to the capacity to reflect upon our own cognitive processes. Although there is an ongoing discussion in the literature on the role of metacognition in learning and academic achievement, little is known about its neurodevelopmental trajectories in early childhood, when children begin to receive formal education in reading. Here, we evaluate the metacognitive ability, estimated under a recently developed Signal Detection Theory model, of a cohort of children aged between 6 and 7 (N=60), who performed three two-alternative-forced-choice tasks (two linguistic: lexical decision task, visual attention span task, and one non-linguistic: emotion recognition task) including trial-by-trial confidence judgements. Our study has three aims. First, we investigated how metacognitive ability (i.e., how confidence ratings track accuracy in the task) relates to performance in general standardized tasks related to students' reading and general cognitive abilities using Spearman's and Bayesian correlation analysis. Second, we assessed whether or not young children recruit common mechanisms supporting metacognition across the different task domains or whether there is evidence for domain-specific metacognition at this early stage of development. This was done by examining correlations in metacognitive measures across different task domains and evaluating cross-task covariance by applying a hierarchical Bayesian model. Third, using robust linear regression and Bayesian regression models, we assessed whether metacognitive ability in this early stage is related to the longitudinal learning of children in a linguistic and a non-linguistic task. Notably, we did not observe any association between students’ reading skills and metacognitive processing in this early stage of reading acquisition. 
Some evidence consistent with domain-general metacognition was found, with significant positive correlations in metacognitive efficiency between the lexical and emotion recognition tasks and substantial covariance indicated by the Bayesian model. However, no reliable correlations were found between metacognitive performance in the visual attention span task and the remaining tasks. Remarkably, metacognitive ability significantly predicted children's learning in the linguistic and non-linguistic domains a year later. These results suggest that metacognitive skill may be dissociated to some extent from general (i.e., language and attention) abilities and further stress the importance of creating educational programs that foster students' metacognitive ability as a tool for long-term learning. More research is crucial to understand whether such programs can enhance metacognitive ability as a transferable skill across distinct domains or whether individual domains should be targeted separately.
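One common summary of how well trial-by-trial confidence tracks accuracy, related to but simpler than the SDT-based efficiency measure used in the study, is the type-2 ROC area, sketched here on hypothetical trial data:

```python
def type2_auroc(correct, confidence):
    """Area under the type-2 ROC: the probability that a randomly chosen
    correct trial carries higher confidence than a randomly chosen
    incorrect one (ties count half). 0.5 = no metacognitive sensitivity."""
    hits = [c for ok, c in zip(correct, confidence) if ok]
    misses = [c for ok, c in zip(correct, confidence) if not ok]
    wins = sum((h > m) + 0.5 * (h == m) for h in hits for m in misses)
    return wins / (len(hits) * len(misses))
```

A participant whose confidence perfectly separates correct from incorrect trials scores 1.0; confidence that ignores accuracy scores 0.5.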

Keywords: confidence ratings, development, metacognitive efficiency, reading acquisition

Procedia PDF Downloads 118
10982 Measurement of IMRT Dose Distribution in Rando Head and Neck Phantom using EBT3 Film

Authors: Pegah Safavi, Mehdi Zehtabian, Mohammad Amin Mosleh-Shirazi

Abstract:

Cancer is one of the leading causes of death in the world, and radiation therapy is one of the main choices for its treatment. Intensity-modulated radiation therapy (IMRT) is a newer radiation therapy technique suited to targets near vital structures such as the parathyroid glands. It is very important to verify the accuracy of the delivered IMRT treatment, because any mistake may lead to further complications for the patient. This paper describes an experiment to determine the accuracy of doses measured by EBT3 film. To test the method, EBT3 film placed on the head and neck of the Rando phantom was irradiated by an IMRT device, and the irradiation was repeated twice. Finally, the planned dose from the treatment system was compared with the dose measured by the EBT3 film, and on this criterion the accuracy of the EBT3 film was evaluated: 95% agreement was reached between the planned and measured values.

Keywords: EBT3, phantom, accuracy, cancer, IMRT

Procedia PDF Downloads 118
10981 An Inverse Optimal Control Approach for the Nonlinear System Design Using ANN

Authors: M. P. Nanda Kumar, K. Dheeraj

Abstract:

The design of a feedback controller that minimizes a given performance criterion for a general non-linear dynamical system is difficult, if not impossible. But for a large class of non-linear dynamical systems, the open-loop control that minimizes a performance criterion can be obtained using the calculus of variations and Pontryagin's minimum principle. In this paper, the open-loop optimal trajectories that minimize a given performance measure are used to train a neural network whose inputs are the state variables of the non-linear dynamical system and whose desired output is the open-loop optimal control. This trained neural network is then used as the feedback controller. In other words, attempts are made here to solve the "inverse optimal control problem" by using state and control trajectories that are optimal in an open-loop sense.
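The training scheme described (open-loop optimal trajectories used as supervised data for a feedback law) can be sketched on a scalar toy problem. For the system x' = u with cost integral of (x^2 + u^2), the optimal feedback is u = -x, so we generate (state, control) pairs from that law and fit a feedback gain by least squares; the linear fit stands in for the paper's RBF neural network, purely for illustration:

```python
import random

def fit_linear_feedback(states, controls):
    # Least-squares gain a in u ≈ a*x, fitted from open-loop optimal pairs.
    sxx = sum(x * x for x in states)
    sxu = sum(x * u for x, u in zip(states, controls))
    return sxu / sxx

random.seed(1)
# "Open-loop optimal" training pairs from the known law u* = -x, plus noise.
xs = [random.uniform(-2, 2) for _ in range(200)]
us = [-x + random.gauss(0, 0.01) for x in xs]
gain = fit_linear_feedback(xs, us)

# Use the learned law in closed loop: x_{k+1} = x_k + dt * u(x_k).
x, dt = 1.0, 0.05
for _ in range(100):
    x += dt * gain * x
```

The recovered gain is close to the true optimal value of -1, and the closed-loop state decays toward zero, which is the essence of turning open-loop optimal data into a feedback controller.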

Keywords: inverse optimal control, radial basis function, neural network, controller design

Procedia PDF Downloads 526
10980 A Taxonomy Proposal on Criterion Structure for Evaluating Freight Village Concepts in Early-Stage Design Projects

Authors: Rıza Gürhan Korkut, Metin Çelik, Süleyman Özkaynak

Abstract:

Early-stage design and development projects for freight village initiatives require a comprehensive analysis of both qualitative and quantitative data. Based on a literature review of structural and operational management requirements, this study proposes an original taxonomy of the criterion structure for assessing freight village conceptualization. The potential challenges and uncertainties of the developed taxonomy are also discussed. Besides requirement analysis, this study is expected to contribute to forthcoming research on benchmarking freight villages in different regions. The methodology used is a systematic review of several articles with respect to their modelling approaches, sustainability, entities, and decisions, together with the uncertainties and features of their models. The major findings of the study are the categories for assessing a project's attributes on environmental, socio-economic, accessibility, and location aspects.

Keywords: logistics centers, freight village, operational management, taxonomy

Procedia PDF Downloads 153
10979 Regular or Irregular: An Investigation of Medicine Consumption Pattern with Poisson Mixture Model

Authors: Lichung Jen, Yi Chun Liu, Kuan-Wei Lee

Abstract:

Abundant data has accumulated in databases and is commonly used to support decision-making. In the healthcare industry, for instance, ordering pharmacy inventory is one of a hospital's key decisions. A large drug inventory increases current costs, and expiration dates may lead to future issues such as drug disposal and recycling. In contrast, underestimating demand for pharmacy inventory, particularly standing drugs, affects medical treatment and possibly the hospital's reputation. The prescription behaviour of hospital physicians is one of the critical factors influencing this decision, particularly irregular prescription behaviour. If a drug's usage in a month is irregular and less than the regular usage, it may cause subsequent stockpiling; on the contrary, if a drug has been prescribed more often than expected, the inventory may prove insufficient. We propose a two-component hierarchical Bayesian mixture model to identify physicians' regular/irregular prescription patterns with probabilities. Heterogeneity across hospitals is considered in the proposed hierarchical Bayes model. The results suggest that modeling physicians' prescription patterns is beneficial for estimating medication order quantities and for the hospital's pharmacy inventory management. Managerial implications and future research are discussed.
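A two-component Poisson mixture of the kind described can be fitted with a plain EM loop (a non-hierarchical sketch on synthetic counts; the paper's hierarchical Bayes model with hospital heterogeneity is not reproduced):

```python
import math
import random

def poisson_logpmf(k, lam):
    return -lam + k * math.log(lam) - math.lgamma(k + 1)

def fit_poisson_mixture(counts, iters=200):
    """EM for a two-component Poisson mixture.
    Returns (weight of low-mean component, low mean, high mean)."""
    lam = [min(counts) + 1.0, max(counts) + 1.0]
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 0 for each observation.
        resp = []
        for k in counts:
            l0 = math.log(w) + poisson_logpmf(k, lam[0])
            l1 = math.log(1 - w) + poisson_logpmf(k, lam[1])
            m = max(l0, l1)
            p0 = math.exp(l0 - m) / (math.exp(l0 - m) + math.exp(l1 - m))
            resp.append(p0)
        # M-step: update the mixing weight and the Poisson means.
        n0 = sum(resp)
        w = n0 / len(counts)
        lam[0] = sum(r * k for r, k in zip(resp, counts)) / n0
        lam[1] = sum((1 - r) * k for r, k in zip(resp, counts)) / (len(counts) - n0)
    if lam[0] > lam[1]:
        lam.reverse()
        w = 1.0 - w
    return w, lam[0], lam[1]

# Synthetic monthly usage counts: a "regular" low-rate pattern mixed with
# a higher-rate one (Knuth's method for Poisson sampling).
random.seed(2)
def rpois(lam):
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

counts = [rpois(2.0) for _ in range(500)] + [rpois(10.0) for _ in range(500)]
w, low, high = fit_poisson_mixture(counts)
```

On this data EM recovers means near 2 and 10 with a mixing weight near one half, the kind of component-membership probability the abstract uses to label a physician's pattern regular or irregular.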

Keywords: hierarchical Bayesian model, Poisson mixture model, medicine prescription behavior, irregular behavior

Procedia PDF Downloads 98
10978 Use of Analytic Hierarchy Process for Plant Site Selection

Authors: Muzaffar Shaikh, Shoaib Shaikh, Mark Moyou, Gaby Hawat

Abstract:

This paper presents the use of the Analytic Hierarchy Process (AHP) in evaluating the site selection of a new plant by a corporation. Due to intense competition at a global level, multinational corporations are continuously striving to minimize the production and shipping costs of their products. One key factor that plays a significant role in cost minimization is where the production plant is located. In the U.S., for example, labor and land costs continue to be very high, while they are much lower in countries such as India, China, and Indonesia. This is why many multinational U.S. corporations (e.g., General Electric, Caterpillar, Ford, and General Motors) have shifted their manufacturing plants abroad. The continued expansion of the Internet and its availability, along with technological advances in computer hardware and software around the globe, have facilitated U.S. corporations' expansion abroad as they seek to reduce production costs. In particular, the management of multinational corporations is constantly engaged in evaluating countries at a broad level, or cities within specific countries, where some or all parts of their end products, or the end products themselves, can be manufactured more cheaply than in the U.S. AHP is based on the preference ratings of a specific decision maker, who can be the Chief Operating Officer of a company or a designated data analytics engineer. It serves as a tool to evaluate, first, the plant site selection criteria and, second, the alternate plant sites themselves against these criteria in a systematic manner. Examples of site selection criteria are: transportation modes, taxes, energy modes, labor force availability, labor rates, raw material availability, political stability, land costs, etc. As a necessary first step under AHP, evaluation criteria and alternate plant site countries are identified. Depending upon the fidelity of analysis, specific cities within a country can also be chosen as alternative facility locations.
AHP experience in this type of analysis indicates that the initial analysis can be performed at the Country-level. Once a specific country is chosen via AHP, secondary analyses can be performed by selecting specific cities or counties within a country. AHP analysis is usually based on preferred ratings of a decision-maker (e.g., 1 to 5, 1 to 7, or 1 to 9, etc., where 1 means least preferred and a 5 means most preferred). The decision-maker assigns preferred ratings first, criterion vs. criterion and creates a Criteria Matrix. Next, he/she assigns preference ratings by alternative vs. alternative against each criterion. Once this data is collected, AHP is applied to first get the rank-ordering of criteria. Next, rank-ordering of alternatives is done against each criterion resulting in an Alternative Matrix. Finally, overall rank ordering of alternative facility locations is obtained by matrix multiplication of Alternative Matrix and Criteria Matrix. The most practical aspect of AHP is the ‘what if’ analysis that the decision-maker can conduct after the initial results to provide valuable sensitivity information of specific criteria to other criteria and alternatives.
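The matrix steps described above can be sketched as follows, using the row geometric mean as a standard approximation to the principal eigenvector (the pairwise ratings and alternative scores below are illustrative, not from the paper):

```python
import math

def ahp_weights(pairwise):
    """Criteria weights from a pairwise comparison matrix via the row
    geometric mean (approximates the principal eigenvector)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def overall_scores(alt_matrix, criteria_weights):
    # alt_matrix[i][j]: weight of alternative i under criterion j;
    # the overall ranking is the matrix-vector product described above.
    return [sum(a * w for a, w in zip(row, criteria_weights))
            for row in alt_matrix]

# A perfectly consistent 3-criteria matrix built from weights (0.5, 0.3, 0.2):
pairwise = [[1.0, 5 / 3, 5 / 2],
            [3 / 5, 1.0, 3 / 2],
            [2 / 5, 2 / 3, 1.0]]
weights = ahp_weights(pairwise)

# Two hypothetical candidate countries scored under the three criteria:
scores = overall_scores([[0.6, 0.2, 0.5],
                         [0.4, 0.8, 0.5]], weights)
```

For a consistent matrix the geometric-mean weights reproduce the underlying ratios exactly; with real, mildly inconsistent judgments they remain a close approximation, which is one reason AHP practice also checks a consistency ratio before trusting the weights.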

Keywords: analytic hierarchy process, multinational corporations, plant site selection, preference ratings

Procedia PDF Downloads 260
10977 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable

Authors: Xinyuan Y. Song, Kai Kang

Abstract:

Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not hold, because unobservable traits, namely latent variables, which are indirectly observable and must be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach that deals with this feature is proposed. The proposed model comprises three parts. The first is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is developed to perform statistical inference. An application of the proposed joint model to a study from the Alzheimer's Disease Neuroimaging Initiative is presented.

Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data

Procedia PDF Downloads 112
10976 Developing Heat-Power Efficiency Criteria for Characterization of Technosphere Structural Elements

Authors: Victoria Y. Garnova, Vladimir G. Merzlikin, Sergey V. Khudyakov, Aleksandr A. Gajour, Andrei P. Garnov

Abstract:

This paper analyzes the heat-energy characteristics of industrial and lifestyle facilities as part of the thermal envelope of the Earth's surface, for inclusion in databases for economic forecasting. An idealized model of the Earth's surface is discussed; this model makes it possible to obtain an energy equivalent for each element of terrain and the world ocean. An energy efficiency criterion for comfortable human existence is introduced, and the dynamics of this criterion offer the possibility of simulating possible technogenic catastrophes under spontaneous industrial development of certain areas of the Earth. A calculation model is given, with a confirmed forecast of the Gulf Stream freezing in the polar regions in 2011 due to the disturbance of the heat-energy balance of the oceanic subsurface oil-polluted layer. Two opposing trends of human development, under limited and unlimited amounts of heat-energy resources, are analyzed.

Keywords: Earth's surface, heat-energy consumption, energy criteria, technogenic catastrophes

Procedia PDF Downloads 289
10975 Choosing between the Regression Correlation, the Rank Correlation, and the Correlation Curve

Authors: Roger L. Goodwin

Abstract:

This paper presents a rank correlation curve. The traditional correlation coefficient is valid both for continuous variables and for integer variables using rank statistics. Since the correlation coefficient was established for rank statistics by Spearman, the calculation can be extended to a correlation curve. This paper presents two survey questions whose responses are non-continuous variables and shows weak to moderate correlation between them; evidently, one question has a negative effect on the other, and a review of the qualitative literature can answer which question and why. The rank correlation curve shows which collection of responses has a positive slope and which has a negative slope, information that is unavailable from the flat, "first-glance" correlation statistics.
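Spearman's rank correlation, the starting point for the rank correlation curve, can be computed directly as the Pearson correlation of average ranks (a sketch handling ties; the curve construction itself is not reproduced):

```python
def average_ranks(xs):
    # Average (mid) ranks, with ties sharing the mean of their positions.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        mid = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = mid
        i = j + 1
    return ranks

def spearman(xs, ys):
    # Pearson correlation applied to the rank vectors.
    rx, ry = average_ranks(xs), average_ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Because only the ordering of responses matters, this is well suited to the non-continuous survey variables the abstract describes.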

Keywords: Bayesian estimation, regression model, rank statistics, correlation, correlation curve

Procedia PDF Downloads 429
10974 The System of Uniform Criteria for the Characterization and Evaluation of Elements of Economic Structure: The Territory, Infrastructure, Processes, Technological Chains, the End Products

Authors: Aleksandr A. Gajour, Vladimir G. Merzlikin, Vladimir I. Veselov

Abstract:

This paper analyzes the heat-energy characteristics of industrial and lifestyle facilities as part of the thermal envelope of the Earth's surface, for inclusion in databases for economic forecasting. An idealized model of the Earth's surface is discussed; this model makes it possible to obtain an energy equivalent for each element of terrain and the world ocean. An energy efficiency criterion for comfortable human existence is introduced, and the dynamics of this criterion offer the possibility of simulating possible technogenic catastrophes under spontaneous industrial development of certain areas of the Earth. A calculation model is given, with a confirmed forecast of the Gulf Stream freezing in the polar regions in 2011 due to the disturbance of the heat-energy balance of the oceanic subsurface oil-polluted layer. Two opposing trends of human development, under limited and unlimited amounts of heat-energy resources, are analyzed.

Keywords: Earth's surface, heat-energy consumption, energy criteria, technogenic catastrophes

Procedia PDF Downloads 363
10973 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models

Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton

Abstract:

Reduced-dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, and this can present a challenge in practice. In this talk, we propose an online estimation method for high dimensional spatio-temporal data based upon the DSTM, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the continuous spatial process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of using the Laplace prior is to filter out wavelet coefficients with low contribution, and thus achieve dimension reduction with significant computational savings. We then propose a hierarchical Bayesian state space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.
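The filtering effect of a Laplace prior corresponds, in its simplest MAP form with a Gaussian likelihood, to soft thresholding of the wavelet coefficients (a sketch; the threshold value is illustrative and the full hierarchical model is not reproduced):

```python
def soft_threshold(coeffs, t):
    """Shrink each wavelet coefficient toward zero by t. Coefficients with
    low contribution (|c| <= t) are set exactly to zero, which is the
    sparsity that drives the dimension reduction."""
    return [max(abs(c) - t, 0.0) * (1 if c > 0 else -1) for c in coeffs]

shrunk = soft_threshold([3.0, -0.5, 0.0, -2.0], 1.0)
```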

Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets

Procedia PDF Downloads 400
10972 Data-Driven Infrastructure Planning for Offshore Wind Farms

Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree

Abstract:

The calculations done at the beginning of a wind farm's life are rarely reliable, which makes it important to study the failure and repair rates of wind turbines under various conditions. The miscalculation arises because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means the reliability function is exponential in nature. This research aims to create a more accurate model using sensory data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data; the result is then converted into times-to-repair and times-to-failure time series data. Several mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease turbine downtime.
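The Weibull fitting step can be sketched as a profile-likelihood root find for the shape parameter, with the scale then available in closed form (synthetic failure times; bisection is used in place of the textbook Newton iteration for robustness):

```python
import math
import random

def weibull_mle(times, lo=0.01, hi=50.0, iters=100):
    """MLE of the two-parameter Weibull (shape k, scale lam) by bisection
    on the profile-likelihood score equation for k."""
    n = len(times)
    logs = [math.log(t) for t in times]
    mean_log = sum(logs) / n

    def score(k):
        # Monotone increasing in k; the MLE shape is its unique root.
        tk = [t ** k for t in times]
        s = sum(tk)
        return sum(x * l for x, l in zip(tk, logs)) / s - 1.0 / k - mean_log

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(t ** k for t in times) / n) ** (1.0 / k)
    return k, lam

# Synthetic times-to-failure from a Weibull(k=2, lam=5) via inverse transform.
random.seed(3)
times = [5.0 * (-math.log(1.0 - random.random())) ** 0.5 for _ in range(2000)]
k_hat, lam_hat = weibull_mle(times)
```

A shape estimate near 1 would indicate the constant-rate (exponential) special case criticized in the abstract; a shape above 1, as here, indicates wear-out.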

Keywords: reliability, bayesian parameter inference, maximum likelihood estimation, weibull function, SCADA data

Procedia PDF Downloads 31
10971 A Two Tailed Secretary Problem with Multiple Criteria

Authors: Alaka Padhye, S. P. Kane

Abstract:

The following study considers some variations of the secretary problem (SP). In a multiple criteria secretary problem (MCSP), the selection of a unit is based on two independent characteristics. The number of units that appear before an observer is known, say N, the best rank of a unit being N. A unit is selected if it is better with respect to either the first or the second or both characteristics. When the number of units is large, then due to constraints like time and cost, the observer might want to stop earlier instead of inspecting all the available units. Let the process terminate at the r2th unit, where r1 …

Keywords: joint distribution, marginal distribution, real ranks, secretary problem, selection criterion, two tailed secretary problem
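
The stopping rule the abstract describes, skip an initial learning sample, then accept the first record-beating unit, can be illustrated with the classical single-characteristic baseline. This is the standard one-criterion secretary rule with the ~N/e cutoff, not the paper's two-tailed multi-criteria variant, and the parameters are assumptions:

```python
import random

def secretary_success_rate(n=50, trials=20000, seed=7):
    """Classical secretary rule: observe the first r units without selecting,
    then accept the first unit better than everything seen so far."""
    random.seed(seed)
    r = round(n / 2.718281828)  # ~ n/e, the asymptotically optimal cutoff
    wins = 0
    for _ in range(trials):
        ranks = random.sample(range(n), n)   # random arrival order; 0 is the best
        best_seen = min(ranks[:r])
        # accept the first record-beater after the cutoff, else end with the last unit
        chosen = next((x for x in ranks[r:] if x < best_seen), ranks[-1])
        wins += (chosen == 0)
    return wins / trials

rate = secretary_success_rate()  # should be close to 1/e ~ 0.37
```

The simulated success rate near 1/e is the benchmark against which multi-criteria and two-tailed stopping rules are compared.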

Procedia PDF Downloads 251
10970 Detecting and Thwarting Interest Flooding Attack in Information Centric Network

Authors: Vimala Rani P, Narasimha Malikarjunan, Mercy Shalinie S

Abstract:

Named Data Networking was brought forth as an instantiation of information-centric networking. Attackers can send a colossal number of spoofed Interests to take hold of the Pending Interest Table (PIT), an attack named the Interest Flooding Attack (IFA), since incoming Interests are recorded in the PITs of the intermediate routers until the corresponding Data packets are received or the entries exceed their time limit. These attacks can be detrimental to network performance. Traditional IFA detection techniques rely on criteria such as the PIT expiration rate or the Interest satisfaction rate, which cannot differentiate an IFA from legitimate traffic surges. Threshold-based traditional methods are, moreover, sensitive to the choice of threshold values. This article proposes an accurate IFA detection mechanism based on a Multiple Feature-based Extreme Learning Machine (MF-ELM). The accuracy of attack detection can be increased by presenting the entropy of Interest names, the Interest satisfaction rate and PIT usage as features extracted in the MF-ELM classifier. Furthermore, we deploy a queue-based hostile Interest prefix mitigation mechanism. The inference from this real-time test bed is that the mechanism can help the network resist IFA with higher accuracy and efficiency.
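
An Extreme Learning Machine is simple at its core: a fixed random hidden layer and output weights solved in closed form by least squares. The sketch below trains a minimal ELM on the three features the abstract names; the synthetic feature distributions are assumptions, not measurements from the authors' test bed:

```python
import numpy as np

class ELM:
    """Minimal Extreme Learning Machine: random tanh hidden layer,
    output weights fitted in one least-squares solve."""
    def __init__(self, n_hidden=30, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)
    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)
    def fit(self, X, y):
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = self._hidden(X)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self
    def predict(self, X):
        return (self._hidden(X) @ self.beta > 0.5).astype(int)

# toy per-window features: [name entropy, Interest satisfaction rate, PIT usage]
rng = np.random.default_rng(2)
normal = np.column_stack([rng.normal(3, .5, 200), rng.normal(.9, .05, 200), rng.normal(.3, .1, 200)])
attack = np.column_stack([rng.normal(6, .5, 200), rng.normal(.4, .05, 200), rng.normal(.9, .1, 200)])
X = np.vstack([normal, attack])
X = (X - X.mean(0)) / X.std(0)          # standardize features
y = np.r_[np.zeros(200), np.ones(200)]  # 0 = normal traffic, 1 = IFA
clf = ELM().fit(X, y)
acc = (clf.predict(X) == y).mean()
```

Because only the output weights are learned, training is a single linear solve, which is what makes ELMs attractive for near-real-time detection.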

Keywords: information-centric network, pending interest table, interest flooding attack, MF-ELM classifier, queue-based mitigation strategy

Procedia PDF Downloads 177
10969 Deep Learning to Improve the 5G NR Uplink Control Channel

Authors: Ahmed Krobba, Meriem Touzene, Mohamed Debeyche

Abstract:

The fifth-generation wireless communications system (5G) will provide more diverse applications and higher quality services for users compared to long-term evolution 4G (LTE). 5G uses a higher carrier frequency, which suffers from information loss across the 5G coverage area. Most 5G users often cannot obtain high-quality communications due to transmission channel noise and channel complexity. The Physical Uplink Control Channel (PUCCH-NR: Physical Uplink Control Channel New Radio) plays a crucial role in 5G NR telecommunication technology and is mainly used to transmit uplink control information (UCI: Uplink Control Information). This study evaluates the performance of the physical uplink control channel PUCCH-NR under low signal-to-noise ratios with various numbers of receive antennas. We propose an artificial intelligence approach based on deep neural networks (deep learning) to estimate the PUCCH-NR channel and compare it with conventional methods such as least-squares (LS) and minimum mean-square error (MMSE) estimation. To evaluate the channel performance, we use the block error rate (BLER) as the evaluation criterion of the communication system. The results show that the deep neural network method gives the best performance compared with MMSE and LS.
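
The two baselines can be contrasted in a few lines of numpy. This is a generic pilot-based, single-tap flat-fading sketch at 0 dB SNR (all parameters are illustrative assumptions, not the PUCCH-NR configuration): LS divides out the pilots, while LMMSE additionally shrinks the estimate using the noise variance and a unit-power channel prior.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_pilot, snr_db = 5000, 8, 0.0
sigma2 = 10 ** (-snr_db / 10)          # noise variance at 0 dB SNR

# Rayleigh flat-fading tap per trial, unit average power
h = (rng.standard_normal(n_trials) + 1j * rng.standard_normal(n_trials)) / np.sqrt(2)
x = np.ones(n_pilot)                   # known unit-power pilot symbols
noise = np.sqrt(sigma2 / 2) * (
    rng.standard_normal((n_trials, n_pilot)) + 1j * rng.standard_normal((n_trials, n_pilot))
)
y = h[:, None] * x + noise             # received pilot observations

xe = x @ x                             # pilot energy
h_ls = (y @ np.conj(x)) / xe           # least-squares estimate
h_mmse = (y @ np.conj(x)) / (xe + sigma2)  # LMMSE, assuming prior E|h|^2 = 1

mse_ls = np.mean(np.abs(h_ls - h) ** 2)
mse_mmse = np.mean(np.abs(h_mmse - h) ** 2)
```

At low SNR the MMSE estimator's shrinkage reliably buys a lower mean squared error than LS, which is the gap a learned estimator then tries to close further.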

Keywords: 5G network, uplink, PUCCH channel, NR-PUCCH channel, deep learning

Procedia PDF Downloads 24
10968 Spatial Suitability Assessment of Onshore Wind Systems Using the Analytic Hierarchy Process

Authors: Ayat-Allah Bouramdane

Abstract:

Since 2010, there have been sustained decreases in the unit costs of onshore wind energy and large increases in its deployment, varying widely across regions. In fact, onshore wind production is affected by air density, because cold air is denser and therefore more effective at producing wind power, and by wind speed, as wind turbines cannot operate in very low or extreme stormy winds. The wind speed is essentially affected by the surface friction or roughness and other topographic features of the land, which slow down winds significantly over the continent. Hence, the identification of the most appropriate locations for onshore wind systems is crucial to maximize their energy output and therefore minimize their Levelized Cost of Electricity (LCOE). This study focuses on the preliminary assessment of onshore wind energy potential in several areas of Morocco, with a particular focus on the Dakhla city, by analyzing the diurnal and seasonal variability of wind speed for different hub heights, the frequency distribution of wind speed, the wind rose, and wind performance indicators such as wind power density, capacity factor, and LCOE. In addition to the climate criterion, other criteria (i.e., topography, location, environment) were selected from Geographic Referenced Information (GRI), reflecting different considerations. The impact of each criterion on the suitability map of onshore wind farms was identified using the Analytic Hierarchy Process (AHP). We find that the majority of suitable zones are located along the Atlantic Ocean and the Mediterranean Sea.
We discuss the sensitivity of the onshore wind site suitability to different aspects such as the methodology—by comparing the Multi-Criteria Decision-Making (MCDM)-AHP results to the Mean-Variance Portfolio optimization framework—and the potential impact of climate change on this suitability map, and provide the final recommendations to the Moroccan energy strategy by analyzing if the actual Morocco's onshore wind installations are located within areas deemed suitable. This analysis may serve as a decision-making framework for cost-effective investment in onshore wind power in Morocco and to shape the future sustainable development of the Dakhla city.
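
The AHP weighting step is mechanically simple: criterion weights come from the principal eigenvector of a pairwise comparison matrix, and Saaty's consistency ratio checks that the judgments are coherent. The pairwise judgments below are hypothetical, chosen only to illustrate the computation, not the study's actual comparisons:

```python
import numpy as np

def ahp_weights(A):
    """Criterion weights from a pairwise comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio (CR < 0.1 is acceptable)."""
    A = np.asarray(A, float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                          # normalize to a weight vector
    ci = (vals[i].real - n) / (n - 1)     # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
    return w, ci / ri

# hypothetical judgments: climate vs topography vs location vs environment
A = [[1,   3,   5,   7],
     [1/3, 1,   3,   5],
     [1/5, 1/3, 1,   3],
     [1/7, 1/5, 1/3, 1]]
w, cr = ahp_weights(A)
```

Each cell A[i][j] encodes how much more important criterion i is than criterion j on Saaty's 1-9 scale; the resulting weights then multiply the normalized criterion layers to produce the suitability map.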

Keywords: analytic hierarchy process (ahp), dakhla, geographic referenced information, morocco, multi-criteria decision-making, onshore wind, site suitability

Procedia PDF Downloads 124
10967 Information Literacy: Concept and Importance

Authors: Gaurav Kumar

Abstract:

An information literate person is one who uses information effectively in all its forms. When presented with questions or problems, an information literate person would know what information to look for, how to search efficiently and be able to access relevant sources. In addition, an information literate person would have the ability to evaluate and select appropriate information sources and to use the information effectively and ethically to answer questions or solve problems. Information literacy has become an important element in higher education. The information literacy movement has internationally recognized standards and learning outcomes. The step-by-step process of achieving information literacy is particularly crucial in an era where knowledge could be disseminated through a variety of media. What is the relationship between information literacy as we define it in higher education and information literacy among non-academic populations? What forces will change how we think about the definition of information literacy in the future and how we will apply the definition in all environments?

Keywords: information literacy, human beings, visual media, computer networks

Procedia PDF Downloads 304
10966 Finite Element Analysis for Earing Prediction Incorporating the BBC2003 Material Model with Fully Implicit Integration Method: Derivation and Numerical Algorithm

Authors: Sajjad Izadpanah, Seyed Hadi Ghaderi, Morteza Sayah Irani, Mahdi Gerdooei

Abstract:

In this research work, a sophisticated yield criterion known as BBC2003, capable of describing the planar anisotropic behavior of aluminum alloy sheets, was integrated into the commercial finite element code ABAQUS/Standard via a user subroutine. The complete formulation of the implementation process using a fully implicit integration scheme, i.e., the classic backward Euler method, is presented, and relevant aspects of the yield criterion are introduced. In order to solve the nonlinear differential and algebraic equations, a line-search algorithm was adopted in the user-defined material subroutine (UMAT) to expand the convergence domain of the iterative Newton-Raphson method. The developed subroutine was used to simulate a challenging computational problem with complex stress states, i.e., deep drawing of the anisotropic aluminum alloy AA3105. The accuracy and stability of the developed subroutine were confirmed by comparing the numerically predicted earing and thickness variation profiles with the experimental results, which showed excellent agreement. The integration of the BBC2003 yield criterion into ABAQUS/Standard represents a significant contribution to the field of computational mechanics and provides a useful tool for analyzing the mechanical behavior of anisotropic materials subjected to complex loading conditions.
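
The role of the line search in the Newton-Raphson solve can be seen on a small scalar problem where the plain iteration diverges. The residual below (arctan, a classic divergence example) and the backtracking constants are illustrative assumptions, not the BBC2003 return-mapping system:

```python
import numpy as np

def newton_line_search(f, fprime, x0, tol=1e-12, max_iter=100):
    """Newton-Raphson with a backtracking line search on |f|,
    which widens the convergence domain of the local iteration."""
    x = float(x0)
    for _ in range(max_iter):
        r = f(x)
        if abs(r) < tol:
            return x
        dx = -r / fprime(x)
        alpha = 1.0
        # halve the step until the residual magnitude actually decreases
        while abs(f(x + alpha * dx)) >= abs(r) and alpha > 1e-6:
            alpha *= 0.5
        x += alpha * dx
    return x

# plain Newton diverges on arctan from x0 = 3; the line search recovers the root at 0
root = newton_line_search(np.arctan, lambda x: 1.0 / (1.0 + x * x), x0=3.0)
```

The same safeguard, applied to the vector residual of the backward Euler return mapping, is what lets the UMAT take full Newton steps only when they genuinely reduce the residual.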

Keywords: BBC2003 yield function, plastic anisotropy, fully implicit integration scheme, line search algorithm, explicit and implicit integration schemes

Procedia PDF Downloads 43
10965 Computational Identification of Signalling Pathways in Protein Interaction Networks

Authors: Angela U. Makolo, Temitayo A. Olagunju

Abstract:

The knowledge of signaling pathways is central to understanding the biological mechanisms of organisms, since it has been identified that in eukaryotic organisms, the number of signaling pathways determines the number of ways the organism will react to external stimuli. Signaling pathways are studied using protein interaction networks constructed from protein-protein interaction data obtained by high-throughput experimental procedures. However, these high-throughput methods are known to produce very high rates of false positive and false negative interactions. In order to construct a useful protein interaction network from this noisy data, computational methods are applied to validate the protein-protein interactions. In this study, a computational technique to identify signaling pathways from a protein interaction network constructed using validated protein-protein interaction data was designed. A weighted interaction graph of the Saccharomyces cerevisiae (baker's yeast) organism was constructed, using the proteins as the nodes and the interactions between them as edges. The weights were obtained using a Bayesian probabilistic network to estimate the posterior probability of interaction between two proteins given the gene expression measurement as biological evidence. Only interactions above a threshold were accepted for the network model. A pathway was formalized as a simple path in the interaction network from a starting protein to an ending protein of interest. We were able to identify some pathway segments, one of which is a segment of the pathway that signals the start of the process of meiosis in S. cerevisiae.
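
The final two steps, thresholding the posterior edge weights and enumerating simple paths, can be sketched in plain Python. The protein names below are yeast pheromone-pathway proteins used purely for illustration, and the posterior probabilities are hypothetical, not the study's Bayesian-network outputs:

```python
def simple_paths(edges, weights, start, end, threshold):
    """Enumerate simple paths in a protein interaction graph, keeping only
    edges whose posterior interaction probability passes the threshold."""
    graph = {}
    for (u, v), w in zip(edges, weights):
        if w >= threshold:  # validated interaction only
            graph.setdefault(u, set()).add(v)
            graph.setdefault(v, set()).add(u)
    paths, stack = [], [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == end:
            paths.append(path)
            continue
        for nxt in graph.get(node, ()):
            if nxt not in path:           # simple path: no repeated proteins
                stack.append((nxt, path + [nxt]))
    return paths

# hypothetical posterior interaction probabilities between yeast proteins
edges = [("STE2", "STE4"), ("STE4", "STE5"), ("STE5", "FUS3"),
         ("STE4", "FUS3"), ("STE2", "FUS3")]
weights = [0.9, 0.8, 0.85, 0.3, 0.2]
paths = simple_paths(edges, weights, "STE2", "FUS3", threshold=0.5)
```

With the 0.5 threshold, the two low-confidence shortcuts are discarded and a single candidate pathway segment STE2 → STE4 → STE5 → FUS3 survives.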

Keywords: Bayesian networks, protein interaction networks, Saccharomyces cerevisiae, signalling pathways

Procedia PDF Downloads 508
10964 Automatic Multi-Label Image Annotation System Guided by Firefly Algorithm and Bayesian Method

Authors: Saad M. Darwish, Mohamed A. El-Iskandarani, Guitar M. Shawkat

Abstract:

Nowadays, the amount of available multimedia data is continuously on the rise. Finding a required image is a challenging task for an ordinary user. Content-based image retrieval (CBIR) computes relevance based on the visual similarity of low-level image features such as color, textures, etc. However, there is a gap between low-level visual features and the semantic meanings required by applications. The typical method of bridging the semantic gap is automatic image annotation (AIA), which extracts semantic features using machine learning techniques. In this paper, a multi-label image annotation system guided by the Firefly algorithm and a Bayesian method is proposed. Firstly, images are segmented using the maximum intra-cluster variance criterion and the Firefly algorithm, a swarm-based approach with high convergence speed and low computation cost that searches for the optimal multiple thresholds. Feature extraction techniques based on color features and region properties are applied to obtain the representative features. After that, the images are annotated using a translation model based on the Net Bayes system, which is efficient for multi-label learning with high precision and low complexity. Experiments are performed using the Corel database. The results show that the proposed system is better than traditional ones for automatic image annotation and retrieval.
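The thresholding step can be illustrated with a toy firefly search. As simplifying assumptions, the sketch optimizes a single threshold rather than multiple ones, uses Otsu's between-class variance as the brightness function, and runs on a synthetic bimodal histogram, so it is a stand-in for the paper's segmentation stage, not a reproduction of it:

```python
import numpy as np

def between_class_variance(hist, t):
    """Otsu-style between-class variance for threshold t on a grey-level histogram."""
    p = hist / hist.sum()
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0.0 or w1 == 0.0:
        return 0.0
    mu0 = (np.arange(t) * p[:t]).sum() / w0
    mu1 = (np.arange(t, p.size) * p[t:]).sum() / w1
    return w0 * w1 * (mu0 - mu1) ** 2

def firefly_threshold(hist, n_fireflies=25, n_iter=40, beta0=0.9, gamma=1e-4, seed=4):
    """Toy firefly search for one segmentation threshold; a firefly's brightness
    is the between-class variance of the two-class split it induces."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(1, hist.size - 1, n_fireflies)
    best_t, best_val = 1, -1.0
    for _ in range(n_iter):
        light = np.array([between_class_variance(hist, int(t)) for t in pos])
        i_best = int(np.argmax(light))
        if light[i_best] > best_val:                 # track the global best
            best_t, best_val = int(pos[i_best]), light[i_best]
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] > light[i]:              # move dimmer firefly toward brighter one
                    r2 = (pos[i] - pos[j]) ** 2
                    pos[i] += beta0 * np.exp(-gamma * r2) * (pos[j] - pos[i])
            pos[i] = np.clip(pos[i] + rng.normal(0, 0.5), 1, hist.size - 1)
    return best_t

# synthetic bimodal grey-level histogram with modes near 60 and 180
rng = np.random.default_rng(5)
pixels = np.r_[rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)]
hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
t = firefly_threshold(hist)
```

The attraction term decays with distance while the small random step keeps the swarm exploring, which is the trade-off behind the algorithm's fast convergence.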

Keywords: feature extraction, feature selection, image annotation, classification

Procedia PDF Downloads 558
10963 Supplier Risk Management: A Multivariate Statistical Modelling and Portfolio Optimization Based Approach for Supplier Delivery Performance Development

Authors: Jiahui Yang, John Quigley, Lesley Walls

Abstract:

In this paper, the authors develop a stochastic model regarding investment in supplier delivery performance development from a buyer's perspective. The authors propose a multivariate model through a Multinomial-Dirichlet distribution within an empirical Bayesian inference framework, representing both the epistemic and aleatory uncertainties in deliveries. A closed-form solution is obtained, and the lower and upper bounds for both the optimal investment level and the expected profit under uncertainty are derived. The theoretical properties provide decision makers with useful insights regarding supplier delivery performance improvement problems where multiple delivery statuses are involved. The authors also extend the model from a single-supplier investment to a supplier portfolio, using a Lagrangian method to obtain a theoretical expression for an optimal investment level and the overall expected profit. The model enables a buyer to know how the marginal expected profit/investment level of each supplier changes with respect to the budget and which supplier should be invested in when additional budget is available. An application of this model is illustrated in a simulation study. Overall, the main contribution of this study is to provide an optimal investment decision-making framework for supplier development, taking into account multiple delivery statuses as well as multiple projects.
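
The conjugacy at the heart of the model is compact: with a Multinomial likelihood over delivery statuses and a Dirichlet prior, the posterior mean of the status probabilities has a closed form. The status categories, counts, and prior below are hypothetical illustrations, not the paper's data:

```python
import numpy as np

def posterior_delivery_probs(counts, alpha):
    """Posterior mean of delivery-status probabilities under a Multinomial
    likelihood with a conjugate Dirichlet(alpha) prior."""
    counts = np.asarray(counts, float)
    alpha = np.asarray(alpha, float)
    # Dirichlet-Multinomial conjugacy: posterior is Dirichlet(counts + alpha)
    return (counts + alpha) / (counts.sum() + alpha.sum())

# hypothetical statuses: on time, slightly late, very late
counts = [80, 15, 5]      # observed deliveries (aleatory evidence)
alpha = [2.0, 1.0, 1.0]   # prior pseudo-counts (epistemic uncertainty)
p = posterior_delivery_probs(counts, alpha)
```

Investment in supplier development can then be modelled as shifting the prior pseudo-counts toward better statuses, and the expected profit evaluated against the resulting posterior.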

Keywords: decision making, empirical bayesian, portfolio optimization, supplier development, supply chain management

Procedia PDF Downloads 266
10962 Design and Analysis of a Laminated Composite Automotive Drive Shaft

Authors: Hossein Kh. Bisheh, Nan Wu

Abstract:

Advanced composite materials are of great importance in engineering structures due to their high specific modulus and strength and low weight. These materials can be used in the design and fabrication of automotive drive shafts to reduce the weight of the structure. Hence, an optimum design of a composite drive shaft satisfying the design criteria can be an appropriate substitute for a metallic drive shaft. The aim of this study is to design and analyze a composite automotive drive shaft with high specific strength and low weight satisfying the design criteria. The Tsai-Wu criterion is chosen as the failure criterion. Various designs with different lay-ups and materials are investigated based on the design requirements, and finally, an optimum design satisfying the design criteria is chosen based on weight and cost considerations. The results of this study indicate that if weight is the main concern, a shaft made of carbon/epoxy can be a good option, and if cost is a more important parameter, a hybrid shaft made of aluminum and carbon/epoxy can be considered.
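
The Tsai-Wu check for a lamina under plane stress reduces to evaluating a quadratic failure index that must stay below 1. The strength values below are representative carbon/epoxy numbers chosen for illustration (not the study's material data), with compressive strengths entered as magnitudes and the common default for the interaction term:

```python
import numpy as np

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Tsai-Wu failure index for a lamina under plane stress; failure is
    predicted when the index reaches 1 (Xc, Yc given as positive magnitudes)."""
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * np.sqrt(F11 * F22)   # common default for the interaction term
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)

# illustrative carbon/epoxy strengths in MPa
Xt, Xc, Yt, Yc, S = 1500.0, 1200.0, 50.0, 250.0, 70.0
# candidate ply stresses (MPa): fiber direction, transverse, in-plane shear
idx = tsai_wu_index(s1=400.0, s2=10.0, t12=30.0, Xt=Xt, Xc=Xc, Yt=Yt, Yc=Yc, S=S)
```

In a lay-up study, this index is evaluated ply by ply for each candidate stacking sequence, and designs whose worst-ply index exceeds 1 are rejected.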

Keywords: Bending natural frequency, Composite drive shaft, Peak torque, Torsional buckling

Procedia PDF Downloads 202
10961 Reinforcement Learning the Born Rule from Photon Detection

Authors: Rodrigo S. Piera, Jailson Sales Araújo, Gabriela B. Lemos, Matthew B. Weiss, John B. DeBrota, Gabriel H. Aguilar, Jacques L. Pienaar

Abstract:

The Born rule was historically viewed as an independent axiom of quantum mechanics until Gleason derived it in 1957 by assuming the Hilbert space structure of quantum measurements [1]. In subsequent decades there have been diverse proposals to derive the Born rule starting from even more basic assumptions [2]. In this work, we demonstrate that a simple reinforcement-learning algorithm, having no pre-programmed assumptions about quantum theory, will nevertheless converge to a behaviour pattern that accords with the Born rule, when tasked with predicting the output of a quantum optical implementation of a symmetric informationally-complete measurement (SIC). Our findings support a hypothesis due to QBism (the subjective Bayesian approach to quantum theory), which states that the Born rule can be thought of as a normative rule for making decisions in a quantum world [3].
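
The experiment's target probabilities are easy to reproduce: the Born rule applied to a qubit SIC-POVM. The sketch below computes them and pairs them with a minimal stand-in for the learner, a running average of simulated detector clicks whose fixed point is the empirical outcome frequency. The state, update rule, and sample count are illustrative assumptions, not the paper's reinforcement-learning agent or optical setup:

```python
import numpy as np

# qubit SIC-POVM: effects E_k = (I + v_k . sigma) / 4 for tetrahedral Bloch vectors
paulis = np.array([[[0, 1], [1, 0]], [[0, -1j], [1j, 0]], [[1, 0], [0, -1]]])
vs = np.array([[0.0, 0.0, 1.0],
               [2*np.sqrt(2)/3, 0.0, -1/3],
               [-np.sqrt(2)/3,  np.sqrt(2/3), -1/3],
               [-np.sqrt(2)/3, -np.sqrt(2/3), -1/3]])
E = [(np.eye(2) + np.einsum('k,kij->ij', v, paulis)) / 4 for v in vs]

rho = (np.eye(2) + 0.8 * paulis[2]) / 2           # a partly polarized qubit state
born = np.array([np.trace(rho @ e).real for e in E])  # Born-rule outcome probabilities

# stand-in "learner": incremental running average of detector clicks,
# whose fixed point is the empirical frequency the Born rule predicts
rng = np.random.default_rng(6)
est = np.full(4, 0.25)                            # uninformed initial guess
for n in range(1, 20001):
    k = rng.choice(4, p=born)                     # simulated detector click
    target = np.zeros(4)
    target[k] = 1.0
    est += (target - est) / n
```

After many trials the learner's estimates sit on top of the Born probabilities, which is the behaviour pattern the paper reports emerging from a reinforcement learner with no quantum assumptions built in.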

Keywords: quantum Bayesianism, quantum theory, quantum information, quantum measurement

Procedia PDF Downloads 52
10960 Novel Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we propose a novel inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can simultaneously derive both a posterior distribution of the latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and the variational EM framework. This is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using the Bayesian formula and the LA technique, we derive an approximation to the posterior distribution of the latent function indicating the possibility that each observation belongs to a certain class. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix necessary to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely, the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
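
The core of the expectation step, finding the mode of the latent posterior with a Newton iteration, can be sketched for the simpler binary case. The version below follows the standard textbook Laplace-approximation iteration for a logistic likelihood; the binary labels, squared-exponential kernel, and toy data are assumptions standing in for the paper's multi-class setting:

```python
import numpy as np

def laplace_mode(K, y, n_iter=20):
    """Newton iteration for the mode of the latent posterior p(f | y) in
    binary GP classification with a logistic likelihood (labels in {-1, +1})."""
    f = np.zeros(len(y))
    for _ in range(n_iter):
        pi = 1.0 / (1.0 + np.exp(-f))
        W = pi * (1 - pi)                 # negative Hessian of the log-likelihood
        grad = (y + 1) / 2 - pi           # gradient of the log-likelihood
        sw = np.sqrt(W)
        B = np.eye(len(y)) + sw[:, None] * K * sw[None, :]
        b = W * f + grad
        a = b - sw * np.linalg.solve(B, sw * (K @ b))
        f = K @ a                         # numerically stable Newton update
    return f

# 1-D toy data: negative labels on the left, positive on the right
x = np.linspace(-2, 2, 20)
y = np.where(x > 0, 1.0, -1.0)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)  # squared-exponential prior
f_hat = laplace_mode(K, y)
```

The Gaussian centered at this mode, with covariance from the Hessian, is the approximate posterior the maximization step then uses to update the kernel hyper-parameters.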

Keywords: bayesian rule, gaussian process classification model with multiclass, gaussian process prior, human action classification, laplace approximation, variational EM algorithm

Procedia PDF Downloads 304