Search results for: generalized likelihood uncertainty estimation
3432 On the Design of Robust Governors of Steam Power Systems Using Polynomial and State-Space Based H∞ Techniques: A Comparative Study
Authors: Rami A. Maher, Ibraheem K. Ibraheem
Abstract:
This work presents a comparative study of the state-space and polynomial methods for the design of a robust governor for load frequency control of steam turbine power systems. The robust governor is synthesized using the two approaches, and the comparison covers time- and frequency-domain performance, controller order, uncertainty representation, weighting filters, and optimality versus sub-optimality. The obtained results are presented in tables and curves, with explanations of the similarities and dissimilarities. Keywords: robust control, load frequency control, steam turbine, H∞-norm, system uncertainty, load disturbance
Procedia PDF Downloads 399
3431 A Fuzzy Inference Tool for Assessing Cancer Risk from Radiation Exposure
Authors: Bouharati Lokman, Bouharati Imen, Bouharati Khaoula, Bouharati Oussama, Bouharati Saddek
Abstract:
Ionizing radiation exposure is an established cancer risk factor. Compared to other common environmental carcinogens, it is relatively easy to determine organ-specific radiation dose and, as a result, radiation dose-response relationships tend to be highly quantified. Nevertheless, there can be considerable uncertainty about questions of radiation-related cancer risk as they apply to risk protection and public policy, and the interpretations of interested parties can differ from one person to another. The tools used to analyze the risk of developing cancer due to radiation are characterized by uncertainty. These uncertainties are related to the exposure history and to the different assumptions involved in the calculation. We believe that the results of statistical calculations are characterized by uncertainty and imprecision, given the physiological variation from one person to another. In this study, we develop a tool based on fuzzy logic inference. As fuzzy logic deals with imprecise and uncertain data, its application in this area is appropriate. We propose a fuzzy system with three input variables (age, sex, and the organ susceptible to cancer). The output variable expresses the risk rate for each organ. A rule base is established from recorded actual data. After successful simulation, the system instantly predicts the risk rate for each organ following chronic exposure to 0.1 Gy. Keywords: radiation exposure, cancer, modeling, fuzzy logic
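As a concrete illustration of the kind of fuzzy inference the abstract describes, the sketch below implements a minimal Mamdani-style system in Python. It is a toy under stated assumptions, not the authors' actual rule base: a single age input with young/old triangular sets and a risk-rate output defuzzified by centroid; all membership supports and the names `tri` and `fuzzy_risk` are invented for illustration.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_risk(age):
    """Mamdani-style inference from age to an organ risk rate (percent)."""
    mu_young = tri(age, 0.0, 20.0, 50.0)     # antecedent sets (assumed supports)
    mu_old = tri(age, 30.0, 70.0, 100.0)
    risk = np.linspace(0.0, 100.0, 1001)     # output universe
    low = tri(risk, 0.0, 20.0, 50.0)
    high = tri(risk, 40.0, 80.0, 100.0)
    # Rules: IF young THEN risk low; IF old THEN risk high
    # (min implication, max aggregation), then centroid defuzzification.
    agg = np.maximum(np.minimum(mu_young, low), np.minimum(mu_old, high))
    return float((risk * agg).sum() / (agg.sum() + 1e-12))
```

A full system along the abstract's lines would add sex and organ as further antecedents, with rules fitted from the recorded data.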
Procedia PDF Downloads 301
3430 Home Legacy Device Output Estimation Using Temperature and Humidity Information by Adaptive Neural Fuzzy Inference System
Authors: Sung Hyun Yoo, In Hwan Choi, Jun Ho Jung, Choon Ki Ahn, Myo Taeg Lim
Abstract:
Home energy management systems (HEMS) have been introduced to reduce power consumption. A HEMS performs electric power control for indoor electric devices. However, HEMS commonly handle only smart devices. In this paper, we propose output estimation for home legacy devices using an artificial neural fuzzy inference system (ANFIS). This paper discusses the overview and the architecture of the system. In addition, the accuracy of the output estimation using the ANFIS inference system is demonstrated via a numerical example. Keywords: artificial neural fuzzy inference system (ANFIS), home energy management system (HEMS), smart device, legacy device
Procedia PDF Downloads 534
3429 Investment Adjustments to Exchange Rate Fluctuations Evidence from Manufacturing Firms in Tunisia
Authors: Mourad Zmami, Oussema BenSalha
Abstract:
The current research aims to assess empirically the reaction of private investment to exchange rate fluctuations in Tunisia, using a sample of 548 firms operating in manufacturing industries between 1997 and 2002. The micro-econometric model we estimate is based on an accelerator-profit investment specification augmented with two variables that measure the variation and the volatility of exchange rates. Estimates using the system GMM method reveal that the effects of exchange rate depreciation on investment are negative, since depreciation increases the cost of imported capital goods. Turning to exchange rate volatility, as measured by a GARCH (1,1) model, our findings assign a significant role to exchange rate uncertainty in explaining the sluggishness of private investment in Tunisia in the full sample of firms. Other estimations based on various sub-samples indicate that the elasticity of investment with respect to exchange rate volatility depends upon firm-specific characteristics such as size and ownership structure. Keywords: investment, exchange rate volatility, manufacturing firms, system GMM, Tunisia
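The GARCH (1,1) volatility measure used above can be sketched in a few lines. This is a minimal illustration of the conditional-variance recursion only, not the authors' estimation code; the parameter values and the commented usage line are assumptions.

```python
import numpy as np

def garch11_variance(returns, omega=0.05, alpha=0.1, beta=0.85):
    """Conditional variance of a GARCH(1,1):
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()                     # a common initialization choice
    for t in range(1, r.size):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# The volatility regressor would then be, e.g.:
# vol = np.sqrt(garch11_variance(exchange_rate_returns))
```

In practice omega, alpha and beta are fitted by maximum likelihood; the square root of the fitted variance series serves as the uncertainty regressor in the investment equation.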
Procedia PDF Downloads 400
3428 A Bayesian Multivariate Microeconometric Model for Estimation of Price Elasticity of Demand
Authors: Jefferson Hernandez, Juan Padilla
Abstract:
Estimation of the price elasticity of demand is a valuable tool for price setting. Given its relevance, it is an active field of microeconomic and statistical research. Price elasticity in the oil and gas industry, in particular for fuels sold in gas stations, has proven to be a challenging topic given market and state restrictions and the underlying correlation structures between the types of fuels sold by the same gas station. This paper explores the Lotka-Volterra model for the problem of price elasticity estimation in the context of fuels; in addition, multivariate random effects are introduced to deal with errors, e.g., measurement or missing data errors. In order to model the underlying correlation structures, the Inverse-Wishart, Hierarchical Half-t and LKJ distributions are studied. The Bayesian paradigm, through Markov Chain Monte Carlo (MCMC) algorithms, is adopted for model estimation. Simulation studies covering a wide range of situations were performed to evaluate parameter recovery for the proposed models and algorithms. Results revealed that the proposed algorithms recovered all model parameters quite well. A real data set analysis was also performed to illustrate the proposed approach. Keywords: price elasticity, volume, correlation structures, Bayesian models
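To make the MCMC idea concrete, here is a deliberately simplified sketch: a random-walk Metropolis sampler recovering the elasticity (the slope of a log-log demand curve) from simulated data, with flat priors and known noise. It is a toy stand-in, not the authors' model, which uses Lotka-Volterra dynamics, multivariate random effects and LKJ-type priors; all data-generating values below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log-log demand data: log q = a + b * log p + noise,
# where b is the (constant) price elasticity of demand.
true_a, true_b, sigma = 2.0, -1.5, 0.1
log_p = rng.uniform(0.0, 1.0, 200)
log_q = true_a + true_b * log_p + rng.normal(0.0, sigma, 200)

def log_post(theta):
    """Log posterior: Gaussian likelihood, known sigma, flat priors."""
    a, b = theta
    resid = log_q - (a + b * log_p)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis
theta = np.array([0.0, 0.0])
lp = log_post(theta)
samples = []
for _ in range(5000):
    proposal = theta + rng.normal(0.0, 0.05, size=2)
    lp_prop = log_post(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = proposal, lp_prop
    samples.append(theta.copy())
post = np.array(samples[2000:])                # discard burn-in
elasticity = post[:, 1].mean()                 # posterior mean of b
```

The full model replaces the flat priors with the correlation-structure priors named in the abstract and samples jointly over the random effects.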
Procedia PDF Downloads 153
3427 Development of a Shape Based Estimation Technology Using Terrestrial Laser Scanning
Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park
Abstract:
The goal of this research is to estimate structural shape change using terrestrial laser scanning. This study develops data reduction and shape change estimation algorithms for large-capacity scan data. The point cloud of scan data was converted to voxels and sampled. A shape estimation technique is studied to detect changes in structural patterns, such as skyscrapers, bridges, and tunnels, based on large point cloud data. The point cloud analysis applies the octree data structure to speed up the post-processing for change detection. The point cloud data provides a relative representative value of shape information, and it is used as a model for detecting point cloud changes in a data structure. The shape estimation model aims to develop a technology that can detect not only gradual but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance. Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement
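The voxel conversion and sampling step described above can be sketched as a centroid-based voxel downsampler. This is a generic illustration of the data-reduction idea, not the authors' algorithm; the function name and interface are assumptions.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Collapse all points that fall in the same voxel to their centroid,
    giving a reduced point cloud of representative values."""
    pts = np.asarray(points, dtype=float)
    keys = np.floor(pts / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    inverse = inverse.reshape(-1)             # robust across NumPy versions
    centroids = np.zeros((counts.size, pts.shape[1]))
    np.add.at(centroids, inverse, pts)        # sum the points in each voxel
    return centroids / counts[:, None]
```

In the study's setting, such per-voxel representative values are what an octree organizes for fast comparison between scans.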
Procedia PDF Downloads 227
3426 Infilling Strategies for Surrogate Model Based Multi-disciplinary Analysis and Applications to Velocity Prediction Programs
Authors: Malo Pocheau-Lesteven, Olivier Le Maître
Abstract:
Engineering and optimisation of complex systems is often achieved through multi-disciplinary analysis, where each subsystem is modeled and interacts with other subsystems to model the complete system. The coherence of the outputs of the different subsystems is achieved through compatibility constraints, which enforce the coupling between the subsystems. Due to the complexity of some subsystems and the computational cost of evaluating their models, it is often necessary to build surrogate models of these subsystems to allow repeated evaluation of the subsystems at a relatively low computational cost. In this paper, Gaussian processes are used, as their probabilistic nature can be leveraged to evaluate the likelihood of satisfying the compatibility constraints. This paper presents infilling strategies to build accurate surrogate models of the subsystems in areas where they are likely to meet the compatibility constraints. It is shown that these infilling strategies can reduce the computational cost of building surrogate models for a given level of accuracy. An application of these methods to velocity prediction programs used in offshore racing naval architecture further demonstrates their applicability in a real engineering context. Some examples of the application of uncertainty quantification to the field of naval architecture are also presented. Keywords: infilling strategy, Gaussian process, multi disciplinary analysis, velocity prediction program
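The "likelihood of satisfying the compatibility constraints" can be sketched from a Gaussian process posterior alone. Below is a minimal version under assumptions of ours (a single scalar constraint g(x) required to be near zero, tolerance `tol`); the paper's actual criterion may differ.

```python
import math

def prob_feasible(mu, sigma, tol=1e-2):
    """Probability that a compatibility constraint |g(x)| <= tol holds,
    given a Gaussian-process posterior g(x) ~ N(mu, sigma**2)."""
    if sigma <= 0.0:
        return 1.0 if abs(mu) <= tol else 0.0
    cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return cdf((tol - mu) / sigma) - cdf((-tol - mu) / sigma)
```

An infill strategy could then, for instance, rank candidate points by `prob_feasible(mu, sigma) * sigma`, concentrating new evaluations where the constraint is likely met and the surrogate is still uncertain.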
Procedia PDF Downloads 148
3425 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as A 'Quantum Leap'
Authors: Anthony Coogan
Abstract:
Born’s probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author’s suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his Particle in the Box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of basic ideas: part of Schrödinger’s ‘Particle in the Box’ theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate, rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived from the theory; may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that such waveforms did not exist. Complex wave forms representing a particle are usually assumed to be continuous. The actual observations were of x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born’s perspective, doing similar work in the years in question, 1926-7, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events. 
Born’s interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g. collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moved in the observer’s direction after the electron had moved away. Astronomers may say that they 'look out into the universe', but this logic is opposed to the views of Newton, Hooke and many observers such as Romer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications for the complex numbers used in applications in science and engineering, some time may pass before any consensus is reached. Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle
Procedia PDF Downloads 194
3424 A Mixing Matrix Estimation Algorithm for Speech Signals under the Under-Determined Blind Source Separation Model
Authors: Jing Wu, Wei Lv, Yibing Li, Yuanfan You
Abstract:
The separation of speech signals has become a research hotspot in the field of signal processing in recent years. It has many applications in teleconferencing, hearing aids, machine speech recognition and so on. The sounds received are usually noisy. The issue of identifying the sounds of interest and obtaining clear sounds in such an environment becomes a problem worth exploring, namely the problem of blind source separation. This paper focuses on under-determined blind source separation (UBSS). Sparse component analysis is generally used for under-determined blind source separation. The method is divided into two parts. First, a clustering algorithm is used to estimate the mixing matrix from the observed signals. Then the signals are separated based on the known mixing matrix. In this paper, the problem of mixing matrix estimation is studied, and an improved algorithm to estimate the mixing matrix for speech signals in the UBSS model is proposed. The traditional potential function algorithm is not accurate for mixing matrix estimation, especially at low signal-to-noise ratio (SNR). In response to this problem, this paper considers an improved potential function method to estimate the mixing matrix. The algorithm not only avoids the influence of insufficient prior information in traditional clustering algorithms, but also improves the estimation accuracy of the mixing matrix. This paper takes the mixing of four speech signals into two channels as an example. Simulation results show that the approach in this paper not only improves the accuracy of estimation, but also applies to any mixing matrix. Keywords: DBSCAN, potential function, speech signal, the UBSS model
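The basic potential-function idea for the 4-source, 2-channel case can be sketched as follows: sparse-domain observations cluster along the directions of the mixing matrix columns, so peaks of an angular kernel density recover those directions. This is a baseline illustration with assumed parameter names and bandwidth, not the paper's improved algorithm.

```python
import numpy as np

def estimate_mixing_matrix(X, n_sources, bandwidth=0.05, grid=720):
    """Estimate a 2 x n_sources mixing matrix from two-channel observations
    by locating peaks of an angular potential (kernel density) function."""
    # Each sparse-domain observation lies near a column of A; map it to an
    # angle in [0, pi), since column signs are ambiguous.
    theta = np.arctan2(X[1], X[0]) % np.pi
    weights = np.linalg.norm(X, axis=0)       # louder samples count more
    grid_theta = np.linspace(0.0, np.pi, grid, endpoint=False)
    d = np.abs(grid_theta[:, None] - theta[None, :])
    d = np.minimum(d, np.pi - d)              # wrap the distance at pi
    potential = (weights * np.exp(-0.5 * (d / bandwidth) ** 2)).sum(axis=1)
    # Keep the n_sources strongest local maxima of the potential.
    is_peak = (potential > np.roll(potential, 1)) & \
              (potential >= np.roll(potential, -1))
    peaks = grid_theta[is_peak]
    peaks = peaks[np.argsort(potential[is_peak])[::-1][:n_sources]]
    return np.vstack([np.cos(peaks), np.sin(peaks)])  # unit-norm columns
```

The paper's contribution lies in replacing this fixed-kernel peak picking with an improved potential function that stays accurate at low SNR.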
Procedia PDF Downloads 129
3423 The Influence of Air Temperature Controls in Estimation of Air Temperature over Homogeneous Terrain
Authors: Fariza Yunus, Jasmee Jaafar, Zamalia Mahmud, Nurul Nisa’ Khairul Azmi, Nursalleh K. Chang
Abstract:
Variation of air temperature from one place to another is caused by air temperature controls. In general, the most important control of air temperature is elevation. Another significant independent variable in estimating air temperature is the location of meteorological stations. Distance to the coastline and land use type also contribute to significant variations in air temperature. On the other hand, in homogeneous terrain, direct interpolation of discrete points of air temperature works well to estimate air temperature values in un-sampled areas. In this process the estimation is solely based on discrete points of air temperature. However, this study shows that air temperature controls also play significant roles in estimating air temperature over the homogeneous terrain of Peninsular Malaysia. An Inverse Distance Weighting (IDW) interpolation technique was adopted to generate continuous air temperature data. This study compared two different datasets: observed mean monthly air temperature T, and the estimation error T–T’, where T’ is the value estimated from a multiple regression model. The multiple regression model considered eight independent variables, namely elevation, latitude, longitude, distance to coastline, and four land use types (water bodies, forest, agriculture and built-up areas), to represent the role of air temperature controls. Cross-validation analysis was conducted to assess the accuracy of the estimated values. Final results show that the estimated values of T–T’ produced lower errors for mean monthly air temperature over homogeneous terrain in Peninsular Malaysia. Keywords: air temperature control, interpolation analysis, Peninsular Malaysia, regression model, air temperature
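The IDW step used above can be sketched in a few lines. This is a generic textbook form of the interpolator, not the study's exact implementation; the function name and the power parameter default are assumptions.

```python
import numpy as np

def idw(stations, values, queries, power=2.0):
    """Inverse Distance Weighting: an un-sampled point gets the weighted
    average of station values, with weights 1 / distance**power."""
    d = np.linalg.norm(queries[:, None, :] - stations[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-12) ** power   # clamp so exact hits dominate
    return (w * values).sum(axis=1) / w.sum(axis=1)
```

In the study's design, the values interpolated are either observed T or the regression residuals T–T’, and the cross-validation compares the two resulting surfaces.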
Procedia PDF Downloads 368
3422 An Empirical Analysis of the Relation between Entrepreneur's Leadership and Team Creativity: The Role of Psychological Empowerment, Cognitive Diversity, and Environmental Uncertainty
Authors: Rui Xing, Xiaowen Zhao, Hao Huang, Chang Liu
Abstract:
Creativity is regarded as vital for new ventures' development, since the whole process of entrepreneurship is rooted in the creation and exploration of new ideas. The entrepreneurial leader is central to the entrepreneurial team and plays an especially important role in this process. However, few scholars have studied the impact of entrepreneurs' leadership styles on the creativity of entrepreneurial teams. In this study, we integrate the historically disjointed literatures on leadership style and team creativity in the entrepreneurship context to understand why and when entrepreneurs' different leadership styles relate to team creativity. We focus on answering the following questions: Is humble leadership necessarily better than narcissistic leadership at increasing the creativity of entrepreneurial teams? Moreover, in which situations is humble or narcissistic leadership more conducive to the entrepreneurial team's creativity? Based on the componential theory of creativity and entrepreneurial cognition theory, we explore the relationship between entrepreneurs' leadership style and team creativity, treating team cognitive diversity and environmental uncertainty as moderators and psychological empowerment as a mediator. We tested our hypotheses using data gathered from 64 teams and 256 individual members from 53 new firms in China's first-tier cities such as Beijing and Shanghai. We found a significant positive relation between entrepreneurs' humble leadership and psychological empowerment, and this positive correlation was more significant when environmental uncertainty was high. In addition, there was a significant negative relation between entrepreneurs' narcissistic leadership and psychological empowerment, and the negative relation was weaker in teams with a high team cognitive diversity value. 
Furthermore, both entrepreneurs' humble leadership and team psychological empowerment were significantly positively related to team creativity, while entrepreneurs' narcissistic leadership was negatively related to team creativity, and this negative relationship was weaker in teams with a high team cognitive diversity or a high environmental uncertainty value. This study has implications for both scholars and entrepreneurs. Firstly, our study enriches the understanding of the role of leadership in entrepreneurial team creativity. Different from previous team creativity literature, which focuses on TMT and R&D teams, this study is a significant attempt to demonstrate that entrepreneurial leadership style is particularly relevant to the core requirements of team creativity. Secondly, this study introduces two moderating variables, cognitive diversity and environmental uncertainty, to explore the different boundary conditions under which the two leadership styles play their roles. This is helpful for entrepreneurs to understand how to leverage leadership to improve entrepreneurial team creativity and how to recruit cognitively diverse employees to moderate the effects of inappropriate leadership on the team. Finally, our findings showed that entrepreneurs' humble leadership makes a unique contribution to explaining team creativity through team psychological empowerment. Keywords: entrepreneurs’ leadership style, entrepreneurial team creativity, team psychological empowerment, team cognitive diversity, environmental uncertainty
Procedia PDF Downloads 125
3421 The European Research and Development Project Improved Nuclear Site Characterization for Waste Minimization in Decommissioning under Constrained Environment: Focus on Performance Analysis and Overall Uncertainty
Authors: M. Crozet, D. Roudil, T. Branger, S. Boden, P. Peerani, B. Russell, M. Herranz, L. Aldave de la Heras
Abstract:
The EURATOM work program project INSIDER (Improved Nuclear Site Characterization for Waste minimization in Decommissioning under Constrained Environment) was launched in June 2017. This 4-year project has 18 partners and aims at improving the management of contaminated materials arising from decommissioning and dismantling (D&D) operations by proposing an integrated characterization methodology. This methodology is based on advanced statistical processing and modelling, coupled with adapted and innovative analytical and measurement methods, with respect to sustainability and economic objectives. In order to achieve these objectives, the approaches will then be applied to common case studies in the form of inter-laboratory comparisons on matrix-representative reference samples and benchmarking. Work Package 6 (WP6) ‘Performance analysis and overall uncertainty’ is in charge of the analysis of the benchmarking on real samples, the organisation of an inter-laboratory comparison on synthetic certified reference materials, and the establishment of the overall uncertainty budget. Assessment of the outcome will be used to provide recommendations and guidance resulting in pre-standardization tests. Keywords: decommissioning, sampling strategy, research and development, characterization, European project
Procedia PDF Downloads 358
3420 When Sex Matters: A Comparative Generalized Structural Equation Model (GSEM) for the Determinants of Stunting Amongst Under-fives in Uganda
Authors: Vallence Ngabo M., Leonard Atuhaire, Peter Clever Rutayisire
Abstract:
The main aim of this study was to establish the differences in both the determinants of stunting and the causal mechanisms through which the identified determinants influence stunting among male and female under-fives in Uganda. Literature shows that male children below the age of five years are at a higher risk of being stunted than their female counterparts. Specifically, studies in Uganda indicate that being a male child is positively associated with stunting, while being female is negatively associated with stunting. Data for 904 male and 829 female under-fives were extracted from the UDHS-2016 survey dataset. Key variables for this study were identified and used in generating the relevant models and paths. Structural equation modeling techniques were used in their generalized form (GSEM). The generalized nature necessitated specifying both the family and link functions for each response variable in the system of the model. The sex of the child (b4) was used as a grouping factor, and the height-for-age (HAZ) scores were used to construct the stunting status of under-fives. The estimated models and paths clearly indicated that the sets of underlying factors influencing male and female under-fives respectively were different, and the paths through which they influence stunting were different. However, some of the determinants that influenced stunting among male under-fives also influenced stunting among female under-fives. To reduce stunting to the desirable state, it is important to consider the multifaceted and complex nature of the risk factors that influence stunting among under-fives but, more importantly, to consider the different sex-specific factors and the causal mechanisms or paths through which they influence stunting. Keywords: stunting, under-fives, sex of the child, GSEM, causal mechanism
Procedia PDF Downloads 127
3419 Downside Risk Analysis of the Nigerian Stock Market: A Value at Risk Approach
Authors: Godwin Chigozie Okpara
Abstract:
Using standard GARCH, EGARCH, and TARCH models on day-of-the-week return series (246 days) from the Nigerian stock market, this paper estimated the VaR of each model variant. An asymmetric return distribution and the fat-tail phenomenon in financial time series were accounted for by estimating the models with normal, Student's t and generalized error distributions. The analysis, based on the Akaike Information Criterion, suggests that the EGARCH model with Student's t innovation distribution furnishes the more accurate estimate of VaR. In the light of this, we apply likelihood ratio tests of proportional failure rates to the VaR derived from the EGARCH model in order to assess the performance of short- and long-position VaR. The result shows that, as alpha ranges from 0.05 to 0.005 for short positions, the failure rate significantly exceeds the prescribed quantiles, while there is no significant difference between the failure rate and the prescribed quantiles for long positions. This suggests that investors and portfolio managers in the Nigerian stock market hold long trading positions, i.e., they buy assets and are concerned about when asset prices will fall. Precisely, the VaR estimates for the long position range from -4.7% at the 95 percent confidence level to -10.3% at the 99.5 percent confidence level. Keywords: downside risk, value-at-risk, failure rate, Kupiec LR tests, GARCH models
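The Kupiec proportional-failure-rate test applied above compares the observed share of VaR exceedances with the nominal rate via a likelihood ratio. A minimal sketch (function name and return convention are our assumptions):

```python
import math

def kupiec_pof(n_obs, n_fail, alpha):
    """Kupiec proportion-of-failures test for a VaR model with target
    failure rate alpha. Returns (LR statistic, reject H0 at the 5% level);
    under H0 the LR statistic is chi-square with 1 degree of freedom."""
    p_hat = n_fail / n_obs

    def loglik(p):
        # Binomial log-likelihood of n_fail exceedances in n_obs days
        out = 0.0
        if n_obs - n_fail > 0:
            out += (n_obs - n_fail) * math.log(1.0 - p)
        if n_fail > 0:
            out += n_fail * math.log(p)
        return out

    lr = -2.0 * (loglik(alpha) - loglik(p_hat))
    return lr, lr > 3.841  # chi-square(1) critical value at 5%
```

With the paper's 246-day sample, a 5% VaR model is expected to fail about 12 times; a markedly higher count drives the LR statistic past the critical value.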
Procedia PDF Downloads 433
3418 Indoor Real-Time Positioning and Mapping Based on Manhattan Hypothesis Optimization
Authors: Linhang Zhu, Hongyu Zhu, Jiahe Liu
Abstract:
This paper investigates a method of indoor real-time positioning and mapping based on the Manhattan world assumption. In indoor environments, relying solely on feature matching techniques or other geometric algorithms for sensor pose estimation inevitably results in cumulative errors, posing a significant challenge to indoor positioning. To address this issue, we adopt the Manhattan world hypothesis to optimize the feature-matching-based camera pose algorithm, which improves the accuracy of camera pose estimation. A special processing method is applied to image data frames that conform to the Manhattan world assumption. When similar data frames appear subsequently, this can be used to eliminate drift in sensor pose estimation, thereby reducing cumulative estimation errors and optimizing mapping and positioning. Through experimental verification, it is found that our method achieves high-precision real-time positioning in indoor environments and successfully generates maps of indoor environments. This provides effective technical support for applications such as indoor navigation and robot control. Keywords: Manhattan world hypothesis, real-time positioning and mapping, feature matching, loopback detection
Procedia PDF Downloads 53
3417 Effects of Different Meteorological Variables on Reference Evapotranspiration Modeling: Application of Principal Component Analysis
Authors: Akinola Ikudayisi, Josiah Adeyemo
Abstract:
The correct estimation of reference evapotranspiration (ETₒ) is required for effective irrigation water resources planning and management. However, several variables must be considered while estimating and modeling ETₒ. This study therefore performs a multivariate analysis of the correlated variables involved in the estimation and modeling of ETₒ at the Vaalharts irrigation scheme (VIS) in South Africa using the Principal Component Analysis (PCA) technique. Weather and meteorological data between 1994 and 2014 were obtained from the South African Weather Service (SAWS) and the Agricultural Research Council (ARC) in South Africa for this study. Average monthly data of minimum and maximum temperature (°C), rainfall (mm), relative humidity (%), and wind speed (m/s) were the inputs to the PCA-based model, while ETₒ is the output. The PCA technique was adopted to extract the most important information from the dataset and to analyze the relationship between the five variables and ETₒ, in order to determine the most significant variables affecting ETₒ estimation at VIS. From the model performance, two principal components explaining 82.7% of the variance were retained after the eigenvector extraction. The results of the two principal components were compared, and the model output shows that minimum temperature, maximum temperature and wind speed are the most important variables in ETₒ estimation and modeling at VIS. In other words, ETₒ increases with temperature and wind speed. Other variables such as rainfall and relative humidity are less important and cannot provide enough information about ETₒ estimation at VIS. The outcome of this study has helped to reduce the input variable dimensionality from five to the three most significant variables in ETₒ modelling at VIS, South Africa. Keywords: irrigation, principal component analysis, reference evapotranspiration, Vaalharts
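The extraction of components and explained-variance ratios described above can be sketched as PCA on the correlation matrix of the standardized inputs. This is a generic illustration, not the study's code; the function name and interface are assumptions.

```python
import numpy as np

def pca_correlation(X, n_components=2):
    """PCA on the correlation matrix (variables standardized implicitly):
    returns explained-variance ratios and loadings of the top components."""
    R = np.corrcoef(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(R)          # ascending eigenvalues
    order = np.argsort(eigval)[::-1][:n_components]
    return eigval[order] / eigval.sum(), eigvec[:, order]
```

Variables with large loadings on the retained components (here, the temperatures and wind speed) are the ones kept for the reduced ETₒ model.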
Procedia PDF Downloads 245
3416 Science School Was Burned: A Case Study of Crisis Management in Thailand
Authors: Proud Arunrangsiwed
Abstract:
This study analyzes the crisis management and image repair strategies used during the crisis of the Mahidol Wittayanusorn School (MWIT) library burning. The library of this school was burned by a 16-year-old male student on June 6th, 2010. The student blamed the school, claiming that the lessons were difficult and other students were selfish. Although no one was in the building during the fire, it caused damage to the building, books and electronic supplies of around 130 million baht (4.4 million USD). This event aroused much public debate about the education system and morality. The strategies used during the crisis were denial, shifting the blame, bolstering, minimization, and uncertainty reduction. The results of using these strategies appeared after the crisis: the number of new students who registered for the examination to get into this school in later years remained the same. Keywords: school, crisis management, violence, image repair strategies, uncertainty, burn
Procedia PDF Downloads 465
3415 Generalized Model Estimating Strength of Bauxite Residue-Lime Mix
Authors: Sujeet Kumar, Arun Prasad
Abstract:
The present work investigates the effect of multiple parameters on the unconfined compressive strength of bauxite residue-lime mixes. A number of unconfined compressive strength tests considering various curing times, lime contents, dry densities and moisture contents were carried out. The results show that an empirical correlation may be successfully developed relating volumetric lime content, porosity, moisture content and curing time to unconfined compressive strength for the range of bauxite residue-lime mixes studied. The proposed empirical correlation efficiently predicts the strength of the bauxite residue-lime mix, and it can be used as a generalized empirical equation to estimate unconfined compressive strength. Keywords: bauxite residue, curing time, porosity/volumetric lime ratio, unconfined compressive strength
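Correlations of this kind are often expressed as a power law in the porosity/volumetric-lime ratio, q_u = A·(η/Liv)^B, fitted by least squares in log space. The form, symbols and fitting sketch below are our assumptions for illustration, not the paper's reported equation.

```python
import numpy as np

def fit_power_law(ratio, ucs):
    """Least-squares fit of q_u = A * (eta / Liv)**B in log space, where
    eta is porosity and Liv the volumetric lime content (assumed form)."""
    B, logA = np.polyfit(np.log(ratio), np.log(ucs), 1)
    return np.exp(logA), B
```

Separate fits per curing time (or an extra curing-time term) would reproduce the multi-parameter correlation the abstract describes.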
Procedia PDF Downloads 226
3414 An Elaboration Likelihood Model to Evaluate Consumer Behavior on Facebook Marketplace: Trust on Seller as a Moderator
Authors: Sharmistha Chowdhury, Shuva Chowdhury
Abstract:
Buying and selling new as well as second-hand goods such as tools, furniture, household items, electronics, clothing, baby stuff, vehicles, and hobby items through the Facebook Marketplace has become a new paradigm for c2c sellers. This phenomenon encourages and empowers decentralised, home-oriented sellers. This study adopts the Elaboration Likelihood Model (ELM) to explain consumer behaviour on the Facebook Marketplace (FM). ELM suggests that consumers process information through the central and peripheral routes, which eventually shape their attitudes towards posts. The central route focuses on information quality, and the peripheral route focuses on cues. Sellers’ FM posts usually include product features, prices, conditions, pictures, and pick-up location. This study uses information relevance and accuracy as central route factors. A post's attractiveness represents a cue and creates positive or negative associations with the product; a post with remarkable pictures increases the attractiveness of the post. So, post aesthetics is used as a peripheral route factor. People influenced via the central or peripheral route form an attitude that involves multiple processes: response and purchase intention. People respond to FM posts through save, share and chat, while purchase intention reflects a positive image of the product. This study proposes trust on sellers as a moderator to test the strength of its influence on consumer attitudes and behaviour; trust on sellers is assessed by whether sellers have badges or not. A sample questionnaire will be developed and distributed among a group of random FM sellers who are selling vehicles on this platform to conduct the study. The chosen product of this study is the vehicle, a high-value purchase item. A high-value purchase requires consumers to form their attitude deliberately, without any sign of impulsiveness. Hence, vehicles are the perfect choice to test the strength of consumer attitudes and behaviour. 
The findings of the study add to the elaboration likelihood model and online second-hand marketplace literature.
Keywords: consumer behaviour, elaboration likelihood model, Facebook Marketplace, C2C marketing
Procedia PDF Downloads 131
3413 Determinants of International Volatility Passthroughs of Agricultural Commodities: A Panel Analysis of Developing Countries
Authors: Tetsuji Tanaka, Jin Guo
Abstract:
The extant literature has not succeeded in uncovering the common determinants of price-volatility transmission of agricultural commodities from international to local markets and has rarely investigated the role of self-sufficiency measures in the context of national food security. We analyzed various factors to determine the degree of price-volatility transmission of wheat, rice, and maize between world and domestic markets using GARCH models with dynamic conditional correlation (DCC) specifications and panel feasible generalized least squares models. We found that a grain autarky system has the potential to diminish volatility pass-throughs for the three grain commodities. Furthermore, we discovered that substitutive consumption between maize and wheat buffers the volatility transmission of both, but rice does not function as a transmission-relieving element for the volatilities of either wheat or maize. The effectiveness of grain-consumption substitution in insulating domestic markets from global pass-throughs is greater than that of cereal self-sufficiency. These implications are highly beneficial for governments of developing countries seeking to protect their domestic food markets from uncertainty in foreign markets and, as such, to improve food security.
Keywords: food security, GARCH, grain self-sufficiency, volatility transmission
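Full DCC-GARCH estimation requires specialized routines; the sketch below, with assumed (not fitted) GARCH(1,1) parameters and synthetic return series, shows only the conditional-variance recursion that underlies the models in the abstract, plus a crude correlation-based proxy for volatility transmission.

```python
import numpy as np

def garch11_variance(returns, omega=0.05, alpha=0.1, beta=0.85):
    """Filter conditional variances h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}.
    Parameters are illustrative; in practice they are fitted by maximum likelihood."""
    h = np.empty(len(returns))
    h[0] = np.var(returns)  # initialize at the sample variance
    for t in range(1, len(returns)):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    return h

rng = np.random.default_rng(0)
world = rng.standard_normal(500)                        # world-market return shocks
local = 0.6 * world + 0.8 * rng.standard_normal(500)    # partially transmitted shocks
h_world = garch11_variance(world)
h_local = garch11_variance(local)
# crude transmission proxy: correlation between the two conditional-variance paths
transmission = np.corrcoef(h_world, h_local)[0, 1]
```

A DCC specification would instead model the time-varying correlation of the standardized residuals; this static correlation is only a stand-in.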
Procedia PDF Downloads 148
3412 Supply Chain Fit and Firm Performance: The Role of the Environment
Authors: David Gligor
Abstract:
The purpose of this study was to build on Fisher's (1997) seminal article. First, it sought to determine how companies can achieve supply chain fit (i.e., match between the products' characteristics and the underlying supply chain design). Second, it attempted to develop a better understanding of how environmental conditions impact the relationship between supply chain fit and performance. The findings indicate that firm supply chain agility allows organizations to quickly adjust the structure of their supply chains and therefore, achieve supply chain fit. In addition, archival and survey data were used to explore the moderating effects of six environmental uncertainty dimensions: munificence, market dynamism, technological dynamism, technical complexity, product diversity, and geographic dispersion. All environmental variables, except technological dynamism, were found to impact the relationship between supply chain fit and firm performance.
Keywords: supply chain fit, environmental uncertainty, supply chain agility, management engineering
Procedia PDF Downloads 589
3411 Estimation of Coefficient of Discharge of Side Trapezoidal Labyrinth Weir Using Group Method of Data Handling Technique
Authors: M. A. Ansari, A. Hussain, A. Uddin
Abstract:
A side weir is a flow-diversion structure provided in the side wall of a channel to divert water from the main channel to a branch channel. The trapezoidal labyrinth weir is a special type of weir in which the crest length is increased to pass a higher discharge. Experimental and numerical studies related to the coefficient of discharge of the trapezoidal labyrinth weir in an open channel are presented in this study. The Group Method of Data Handling (GMDH) with a quadratic-polynomial transfer function has been used to predict the coefficient of discharge of the side trapezoidal labyrinth weir. A new model for the coefficient of discharge of the labyrinth weir is also developed by the regression method. Generalized models for predicting the coefficient of discharge using a GMDH network have been developed as well. Predictions based on the GMDH model are more satisfactory than those given by the traditional regression equations.
Keywords: discharge coefficient, group method of data handling, open channel, side labyrinth weir
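A single GMDH selection layer with the quadratic (Ivakhnenko) transfer function named in the abstract can be sketched as follows; the input variables and the synthetic discharge-coefficient relation are placeholders for illustration, not the study's data.

```python
import numpy as np
from itertools import combinations

def quad_features(x1, x2):
    """Quadratic Ivakhnenko polynomial basis used as the GMDH transfer function."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def gmdh_layer(X_tr, y_tr, X_va, y_va):
    """Fit a quadratic polynomial for every pair of inputs by least squares,
    then rank the pairs by validation RMSE (the GMDH selection criterion)."""
    results = []
    for i, j in combinations(range(X_tr.shape[1]), 2):
        A = quad_features(X_tr[:, i], X_tr[:, j])
        coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
        pred = quad_features(X_va[:, i], X_va[:, j]) @ coef
        results.append(((i, j), np.sqrt(np.mean((y_va - pred) ** 2)), coef))
    return sorted(results, key=lambda r: r[1])

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (200, 4))                   # stand-ins for weir geometry ratios
y = 0.6 - 0.3 * X[:, 0] ** 2 + 0.2 * X[:, 0] * X[:, 1]  # synthetic discharge coefficient
ranked = gmdh_layer(X[:100], y[:100], X[100:], y[100:])
best_pair, best_rmse, _ = ranked[0]
```

A full GMDH network would stack such layers, feeding the best pair outputs forward until the validation error stops improving.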
Procedia PDF Downloads 151
3410 Applying Serious Game Design Frameworks to Existing Games for Integration of Custom Learning Objectives
Authors: Jonathan D. Moore, Mark G. Reith, David S. Long
Abstract:
Serious games (SGs) have been shown to be an effective teaching tool in many contexts. Because of the success of SGs, several design frameworks have been created to expedite the process of making original serious games to teach specific learning objectives (LOs). Even with these frameworks, the time required to create a custom SG from conception to implementation can range from months to years. Furthermore, it is even more difficult to design a game framework that allows an instructor to create customized game variants supporting multiple LOs within the same field. This paper proposes a refactoring methodology to apply the theoretical principles from well-established design frameworks to a pre-existing serious game. The expected result is a generalized game that can be quickly customized to teach LOs not originally targeted by the game. This methodology begins by describing the general components in a game, then uses a combination of two SG design frameworks to extract the teaching elements present in the game. The identified teaching elements are then used as the theoretical basis to determine the range of LOs that can be taught by the game. This paper evaluates the proposed methodology by presenting a case study of refactoring the serious game Battlespace Next (BSN) to teach joint military capabilities. The range of LOs that can be taught by the generalized BSN are identified, and examples of creating custom LOs are given. Survey results from users of the generalized game are also provided. Lastly, the expected impact of this work is discussed and a road map for future work and evaluation is presented.
Keywords: serious games, learning objectives, game design, learning theory, game framework
Procedia PDF Downloads 108
3409 Effect of Specimen Thickness on Probability Distribution of Grown Crack Size in Magnesium Alloys
Authors: Seon Soon Choi
Abstract:
Fatigue crack growth is stochastic because fatigue behavior involves uncertainty and randomness. Therefore, it is necessary to determine the probability distribution of the grown crack size at a specific fatigue crack propagation life for structural maintenance as well as reliability estimation. The essential purpose of this study is to identify the probability distribution that best fits the grown crack size at a specified fatigue life in a rolled magnesium alloy under different specimen thickness conditions. Fatigue crack propagation experiments were carried out in laboratory air under three specimen-thickness conditions using AZ31 to investigate the stochastic crack-growth behavior. The goodness of fit of candidate probability distributions for the grown crack size under the different thickness conditions is assessed by the Anderson-Darling test. The effect of specimen thickness on the variability of the grown crack size is also investigated.
Keywords: crack size, fatigue crack propagation, magnesium alloys, probability distribution, specimen thickness
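A hedged illustration of the goodness-of-fit step: SciPy's Anderson-Darling routine covers a fixed set of families, so a lognormal candidate (a common choice for crack-size data, assumed here rather than taken from the study) can be checked by testing the log of the data against the normal family.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# synthetic grown-crack sizes (mm); real data would come from the AZ31 experiments
crack = rng.lognormal(mean=1.0, sigma=0.25, size=200)

# Testing log(crack) against the normal family is equivalent to checking
# a lognormal fit for the crack sizes themselves.
res = stats.anderson(np.log(crack), dist='norm')
# critical_values are ordered for the 15%, 10%, 5%, 2.5%, 1% significance levels
fits_lognormal = bool(res.statistic < res.critical_values[2])  # 5% level
```

Repeating this per thickness condition, and across candidate families, mirrors the comparison described in the abstract.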
Procedia PDF Downloads 491
3408 An Energy Detection-Based Algorithm for Cooperative Spectrum Sensing in Rayleigh Fading Channel
Authors: H. Bakhshi, E. Khayyamian
Abstract:
Cognitive radios have been recognized as one of the most promising technologies for dealing with the scarcity of the radio spectrum. In cognitive radio systems, secondary users are allowed to utilize the frequency bands of primary users when those bands are idle. Hence, how to accurately detect idle frequency bands has attracted many researchers' interest. Detection performance is sensitive to noise power and gain fluctuation. Since the signal-to-noise ratio (SNR) between the primary user and each secondary user differs and changes over time, SNR and noise-power estimation is essential. In this paper, we present a cooperative spectrum-sensing algorithm that uses SNR estimation to improve detection performance in realistic conditions.
Keywords: cognitive radio, cooperative spectrum sensing, energy detection, SNR estimation, spectrum sensing, Rayleigh fading channel
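A minimal energy detector for one secondary user can be sketched as below; the threshold uses the common Gaussian (central-limit) approximation of the energy statistic, and the noise power, target false-alarm rate, and test signal are assumed values, not the paper's.

```python
import numpy as np
from scipy.stats import norm

def energy_detector(x, noise_power=1.0, p_fa=0.05):
    """Decide primary-user presence from the average energy of the samples.
    Threshold from the Gaussian (CLT) approximation of the chi-square statistic."""
    n = x.size
    threshold = noise_power * (1.0 + norm.isf(p_fa) * np.sqrt(2.0 / n))
    statistic = np.mean(np.abs(x) ** 2)
    return statistic > threshold, statistic

rng = np.random.default_rng(3)
n = 1000
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)  # unit power
signal = 0.7 * np.exp(2j * np.pi * 0.1 * np.arange(n))   # assumed primary-user tone
detected, stat = energy_detector(noise + signal)
```

In a cooperative scheme, each secondary user would report its statistic (or decision) to a fusion center, which combines them, e.g., by weighting with the estimated SNRs.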
Procedia PDF Downloads 440
3407 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions
Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen
Abstract:
Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate-effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes nearly all of the most frequently used distributions, such as the gamma, exponential, Weibull, and lognormal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation of the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or lognormal distributions.
Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma
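The nesting claim, that the gamma and Weibull distributions are special cases of the GG family, can be checked numerically with SciPy's generalized-gamma parameterization, pdf(x, a, c) = c·x^(ac−1)·e^(−x^c)/Γ(a); the particular shape values below are arbitrary.

```python
import numpy as np
from scipy import stats

x = np.linspace(0.1, 5.0, 50)

# Generalized gamma with c = 1 collapses to the gamma distribution
a = 2.5
pdf_gg_c1 = stats.gengamma.pdf(x, a, 1.0)
pdf_gamma = stats.gamma.pdf(x, a)

# Generalized gamma with a = 1 collapses to the Weibull distribution
c = 1.7
pdf_gg_a1 = stats.gengamma.pdf(x, 1.0, c)
pdf_weibull = stats.weibull_min.pdf(x, c)
```

The exponential case follows with a = c = 1; the lognormal arises only as a limiting case, which is why the GG family is said to include it approximately.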
Procedia PDF Downloads 167
3406 Probabilistic Analysis of Fiber-Reinforced Infinite Slopes
Authors: Assile Abou Diab, Shadi Najjar
Abstract:
Fiber-reinforcement is an effective soil improvement technique for applications involving the prevention of shallow failures on the slope face and the repair of existing slope failures. A typical application is the stabilization of cohesionless infinite slopes. The objective of this paper is to present a probabilistic, reliability-based methodology (based on Monte Carlo simulations) for the design of a practical fiber-reinforced cohesionless infinite slope, taking into consideration the impact of various sources of uncertainty. Recommendations are made regarding the required factors of safety that need to be used to achieve a given target reliability level. These factors of safety could differ from the traditional deterministic factor of safety.
Keywords: factor of safety, fiber reinforcement, infinite slope, reliability-based design, uncertainty
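A minimal Monte Carlo sketch of such a reliability analysis, assuming the classical infinite-slope factor of safety FS = c_f/(γ·z·sinβ·cosβ) + tanφ/tanβ, with the fiber contribution modeled as an equivalent cohesion c_f and a normally distributed friction angle; all parameter values are illustrative, not the paper's.

```python
import numpy as np

def failure_probability(c_fiber, beta_deg=30.0, unit_weight=18.0, depth=2.0,
                        phi_mean=32.0, phi_cov=0.10, n=100_000, seed=42):
    """Monte Carlo estimate of P(FS < 1) for a fiber-reinforced infinite slope.
    FS = c_f/(gamma*z*sin(b)*cos(b)) + tan(phi)/tan(b); phi ~ Normal."""
    rng = np.random.default_rng(seed)
    b = np.radians(beta_deg)
    phi = np.radians(rng.normal(phi_mean, phi_cov * phi_mean, n))
    fs = (c_fiber / (unit_weight * depth * np.sin(b) * np.cos(b))
          + np.tan(phi) / np.tan(b))
    return np.mean(fs < 1.0)

pf_plain = failure_probability(c_fiber=0.0)   # unreinforced slope
pf_fiber = failure_probability(c_fiber=5.0)   # assumed fiber-induced cohesion, kPa
```

Inverting this relation, i.e., searching for the deterministic factor of safety whose failure probability meets a target reliability index, yields the recommendations the abstract describes.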
Procedia PDF Downloads 358
3405 Predicting Dose Level and Length of Time for Radiation Exposure Using Gene Expression
Authors: Chao Sima, Shanaz Ghandhi, Sally A. Amundson, Michael L. Bittner, David J. Brenner
Abstract:
In a large-scale radiologic emergency, the potentially affected population needs to be triaged efficiently using various biomarkers, since personal dosimeters are not likely to be worn by individuals. It has long been established that radiation injury can be estimated effectively using panels of genetic biomarkers. Furthermore, the rate of radiation, in addition to the dose, plays a major role in determining biological responses. Therefore, a better and more accurate triage involves estimating both the dose level of the exposure and its duration. To that end, a large in vivo study was carried out on mice with the internal emitter caesium-137 (¹³⁷Cs). Four injection doses of ¹³⁷Cs were used: 157.5 μCi, 191 μCi, 214.5 μCi, and 259 μCi. Cohorts of 6-7 mice from the control arm and each dose level were sacrificed, and blood was collected 2, 3, 5, 7, and 14 days after injection for microarray RNA gene-expression analysis. Using a generalized linear model with penalized maximum likelihood, a panel of 244 genes was established, and both the injection dose and the number of days after injection were accurately predicted for all 155 subjects using this panel. This shows that microarray gene expression can be used effectively in radiation biodosimetry to predict both the dose level and the length of exposure, which provides a more holistic view of radiation exposure and helps improve radiation damage assessment and treatment.
Keywords: caesium-137, gene expression microarray, multivariate responses prediction, radiation biodosimetry
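The penalized-maximum-likelihood selection of a sparse gene panel can be illustrated in miniature; the sketch below swaps in an L1-penalized linear model fit by coordinate descent on synthetic expression data (the dimensions, coefficients, and response are invented, not the study's 244-gene panel or its GLM).

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """L1-penalized least squares by cyclic coordinate descent (soft-thresholding).
    A linear stand-in for the penalized GLM used to select the gene panel."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]        # partial residual
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (X[:, j] @ X[:, j])
    return beta

rng = np.random.default_rng(5)
n_mice, n_genes = 150, 50
X = rng.standard_normal((n_mice, n_genes))              # normalized expression levels
dose = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.standard_normal(n_mice)
beta = lasso_cd(X, dose, lam=20.0)
panel = np.flatnonzero(np.abs(beta) > 1e-6)             # selected gene indices
```

The L1 penalty drives uninformative coefficients exactly to zero, so the surviving indices form the biomarker panel; the study's two-response setting (dose and days) would use a multivariate extension of the same idea.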
Procedia PDF Downloads 189
3404 Using AI to Advance Factory Planning: A Case Study to Identify Success Factors of Implementing an AI-Based Demand Planning Solution
Authors: Ulrike Dowie, Ralph Grothmann
Abstract:
Rational planning decisions are based upon forecasts. Precise forecasting therefore has a central role in business; the prediction of customer demand is a prime example. This paper introduces recurrent neural networks to model customer demand and combines the forecast with uncertainty measures to derive decision support for the demand planning department. It identifies and describes the keys to the successful implementation of an AI-based solution: bringing together data with business knowledge, AI methods, and user experience, and applying agile software development practices.
Keywords: agile software development, AI project success factors, deep learning, demand forecasting, forecast uncertainty, neural networks, supply chain management
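The abstract pairs recurrent-network forecasts with uncertainty measures; as a compact stand-in for the neural model, the sketch below uses a bootstrap-residual ensemble of AR(1) forecasts on synthetic demand to show how a point forecast and quantile bands can be produced for planning decisions.

```python
import numpy as np

def ar1_fit(series):
    """Least-squares AR(1): d_t = a + b * d_{t-1} + e_t; returns (a, b, residuals)."""
    x, y = series[:-1], series[1:]
    b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    a = y.mean() - b * x.mean()
    return a, b, y - (a + b * x)

def forecast_with_bands(series, horizon=4, n_boot=500, seed=0):
    """Bootstrap-residual ensemble forecast: mean path plus 10%/90% bands."""
    a, b, resid = ar1_fit(series)
    rng = np.random.default_rng(seed)
    paths = np.empty((n_boot, horizon))
    for k in range(n_boot):
        last = series[-1]
        for h in range(horizon):
            last = a + b * last + rng.choice(resid)     # resample a past shock
            paths[k, h] = last
    return (paths.mean(axis=0),
            np.quantile(paths, 0.1, axis=0),
            np.quantile(paths, 0.9, axis=0))

rng = np.random.default_rng(9)
demand = 100 + 0.2 * np.cumsum(rng.normal(0, 1, 120)) + rng.normal(0, 2, 120)
mean, lo, hi = forecast_with_bands(demand)
```

The width of the band is the uncertainty measure a planner can act on, e.g., ordering safety stock when the 90% quantile exceeds capacity.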
Procedia PDF Downloads 173
3403 Application of IF Rough Data on Knowledge Towards Malaria of Rural Tribal Communities in Tripura
Authors: Chhaya Gangwal, R. N. Bhaumik, Shishir Kumar
Abstract:
Handling the uncertainty and impreciseness of knowledge is a challenging task in information systems. Intuitionistic fuzzy (IF) and rough set theory enhance databases by allowing the management of uncertainty and impreciseness. This paper presents a new, efficient query optimization technique for multi-valued or imprecise IF rough databases. The usefulness of this technique is illustrated on malaria knowledge from the rural tribal communities of Tripura, where most of the information is multi-valued and imprecise. Queries about malaria knowledge are then executed in SQL Server to simplify the implementation of IF rough data querying.
Keywords: intuitionistic fuzzy set, rough set, relational database, IF rough relational database
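How an imprecise IF attribute value might be filtered in a query can be sketched with the standard (α, β)-cut; the records, field names, and thresholds below are invented for illustration and are not the study's data.

```python
# Each attribute value carries an intuitionistic fuzzy pair (mu, nu):
# membership mu, non-membership nu, with mu + nu <= 1,
# and hesitation margin pi = 1 - mu - nu capturing the impreciseness.
records = [
    {"village": "A", "malaria_risk": (0.8, 0.1)},
    {"village": "B", "malaria_risk": (0.4, 0.5)},
    {"village": "C", "malaria_risk": (0.6, 0.2)},
]

def alpha_beta_cut(rows, field, alpha, beta):
    """Select rows whose IF value satisfies mu >= alpha and nu <= beta,
    the (alpha, beta)-cut used to resolve imprecise queries to crisp answers."""
    return [r for r in rows if r[field][0] >= alpha and r[field][1] <= beta]

high_risk = alpha_beta_cut(records, "malaria_risk", alpha=0.6, beta=0.2)
```

In the relational setting, such a cut translates to an ordinary WHERE clause over the stored (mu, nu) columns, which is what makes the SQL Server implementation straightforward.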
Procedia PDF Downloads 429