Search results for: elaboration likelihood model theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19965

19695 Media Richness Perspective on Web 2.0 Usage for Knowledge Creation: The Case of the Cocoa Industry in Ghana

Authors: Albert Gyamfi

Abstract:

Cocoa plays a critical role in the socio-economic development of Ghana. Meanwhile, smallholder farmers, most of whom are illiterate, dominate the industry. According to the cocoa-based agricultural knowledge and information system (AKIS) model, knowledge is created and transferred within the industry among three key actors: cocoa researchers, extension experts, and cocoa farmers. Drawing on the SECI model, the media richness theory (MRT), and the AKIS model, a conceptual web 2.0-based AKIS model (AKIS 2.0) is developed and used to assess the possible effects of social media usage for knowledge creation in the Ghanaian cocoa industry. A mixed-methods approach with a survey questionnaire was employed, and a second-order multi-group structural equation model (SEM) was used to analyze the data. The study concludes that the use of web 2.0 applications for knowledge creation would lead to sustainable interactions among the key knowledge actors for effective knowledge creation in the cocoa industry in Ghana.

Keywords: agriculture, cocoa, knowledge, media, web 2.0

Procedia PDF Downloads 302
19694 Acceptance of Health Information Application in Smart National Identity Card (SNIC) Using a New I-P Framework

Authors: Ismail Bile Hassan, Masrah Azrifah Azmi Murad

Abstract:

This study proposes a novel framework of individual-level technology adoption, known as I-P (Individual-Privacy), for the Smart National Identity Card health information application. Many countries have introduced smart national identity cards (SNICs) with various applications, such as a health information application, embedded inside them. However, the degree to which citizens accept and use some of the applications embedded in smart national identity cards remains unknown to many governments and application providers. Moreover, previous studies revealed that the factors of trust, perceived risk, privacy concern, and perceived credibility need to be incorporated into more comprehensive models such as the extended Unified Theory of Acceptance and Use of Technology, known as UTAUT2. UTAUT2 is currently one of the most widespread and leading theories in the information systems literature. This research identifies factors affecting citizens’ behavioural intention to use the health information application embedded in the SNIC and provides a better understanding of the relevant factors that governments and application providers would need to consider in predicting citizens’ acceptance of new technology in the future. We propose a conceptual framework by combining the UTAUT2 and Privacy Calculus Model constructs and adding perceived credibility as a new variable. The proposed framework may assist government planners, decision makers, and policy makers involved in e-government projects. An empirical study may be conducted in the future to empirically validate this I-P framework.

Keywords: unified theory of acceptance and use of technology (UTAUT) model, UTAUT2 model, smart national identity card (SNIC), health information application, privacy calculus model (PCM)

Procedia PDF Downloads 436
19693 Pure and Mixed Nash Equilibria Domain of a Discrete Game Model with Dichotomous Strategy Space

Authors: A. S. Mousa, F. Shoman

Abstract:

We present a discrete game-theoretical model with homogeneous individuals who make simultaneous decisions. In this model, the strategy space of all individuals is a discrete and dichotomous set consisting of two strategies. We fully characterize the coherent, split, and mixed strategies that form Nash equilibria, and we determine the corresponding Nash domains for all individuals. We find all strategic thresholds at which individuals can change their minds if small perturbations in the parameters of the model occur.
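
For intuition on the equilibrium characterization, here is a minimal sketch, with an assumed payoff matrix that is not the authors' model, that enumerates the pure-strategy Nash equilibria of a symmetric two-player game over a dichotomous strategy set and solves the indifference condition for the mixed equilibrium.

```python
import numpy as np

# Symmetric 2-player game with a dichotomous strategy set {0, 1}.
# A[i, j] = payoff to a player choosing i when the opponent chooses j.
# The payoff values below are hypothetical, for illustration only.
A = np.array([[3.0, 1.0],
              [2.0, 2.5]])

# Pure-strategy Nash equilibria: (i, j) such that i is a best reply to j
# and j is a best reply to i.
pure_eq = [(i, j)
           for i in range(2) for j in range(2)
           if A[i, j] >= A[1 - i, j] and A[j, i] >= A[1 - j, i]]
print("pure NE:", pure_eq)

# Mixed equilibrium: probability p on strategy 0 that makes the opponent
# indifferent: p*A[0,0] + (1-p)*A[0,1] == p*A[1,0] + (1-p)*A[1,1].
denom = (A[0, 0] - A[1, 0]) + (A[1, 1] - A[0, 1])
if denom != 0:
    p = (A[1, 1] - A[0, 1]) / denom
    if 0 < p < 1:
        print("mixed NE: play strategy 0 with probability", round(p, 3))
```

Small perturbations of the entries of A move the threshold p, which is the numerical analogue of the strategic thresholds characterized in the paper.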

Keywords: coherent strategy, split strategy, pure strategy, mixed strategy, Nash equilibrium, game theory

Procedia PDF Downloads 119
19692 A Numerical Model Simulation for an Updraft Gasifier Using High-Temperature Steam

Authors: T. M. Ismail, M. A. El-Salam

Abstract:

A mathematical model study was carried out to investigate the gasification of biomass fuels using high-temperature air and steam (up to 1000°C) as the gasifying agents. In this study, a 2D computational fluid dynamics model was developed to study the gasification process in an updraft gasifier, considering drying, pyrolysis, combustion, and gasification reactions. The gas and solid phases were resolved using a Euler-Euler multiphase approach, with exchange terms for momentum, mass, and energy. The standard k-ε turbulence model was used in the gas phase, and the particle phase was modeled using the kinetic theory of granular flow. The results show that the present model is promising in its capability and sensitivity to the parameter effects that influence the gasification process.

Keywords: computational fluid dynamics, gasification, biomass fuel, fixed bed gasifier

Procedia PDF Downloads 367
19691 Time of Week Intensity Estimation from Interval Censored Data with Application to Police Patrol Planning

Authors: Jiahao Tian, Michael D. Porter

Abstract:

Law enforcement agencies are tasked with crime prevention and crime reduction under limited resources. Having an accurate temporal estimate of the crime rate would be valuable in achieving such a goal. However, estimation is usually complicated by the interval-censored nature of crime data: the event time is known only to lie within an interval rather than being observed exactly. This type of data is prevalent in criminology because of the absence of victims for certain types of crime. Despite its importance, research on the temporal analysis of crime has lagged behind the spatial component. Inspired by the success of solving crime-related problems with a statistical approach, we propose a statistical model for the temporal intensity estimation of crime with censored data. We cast the problem of intensity estimation as a Poisson regression and use an EM algorithm to estimate the parameters. Two special penalty terms are added to the likelihood that provide smoothness over the time of day and the day of the week. The resulting model provides accurate intensity estimates, shows superior performance to the competing model, and can also uncover day-of-week clusters that share the same intensity patterns. Anticipating where and when crimes might occur is a key element of successful policing strategies, and our research is in line with the Smart Policing Initiative (SPI) proposed by the Bureau of Justice Assistance (BJA) as an effort to support law enforcement agencies in building evidence-based, data-driven law enforcement tactics. The goal is to identify strategic approaches that are effective in crime prevention and reduction; in our case, we allow agencies to deploy their resources for a relatively short period of time to achieve the maximum level of crime reduction. By analyzing a particular area within a city where data are available, the proposed approach provides not only accurate intensity estimates for the time units considered but also a time-varying crime incidence pattern. Both are helpful in the allocation of limited resources, either by improving existing patrol plans using the discovered day-of-week clusters or by directing extra resources where they are available.
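
As a rough illustration of the estimation idea (not the authors' implementation: the reporting windows, the hour-of-week grid, and the absence of the smoothness penalties are all simplifications), the sketch below runs an EM algorithm for a time-of-week Poisson intensity from interval-censored counts. The E-step allocates each interval's count to its hours in proportion to the current intensity; the M-step is the per-hour Poisson MLE.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 168                                  # hours in a week (time-of-week grid)
true_lam = 1.0 + 0.8 * np.sin(np.arange(H) * 2 * np.pi / 24)

# Simulate interval-censored data: each crime is only known to fall in a
# reporting window of 2-12 consecutive hours (window lengths are hypothetical).
intervals, counts = [], []
for _ in range(400):
    s = rng.integers(0, H)
    w = rng.integers(2, 13)
    hrs = np.arange(s, s + w) % H
    intervals.append(hrs)
    counts.append(rng.poisson(true_lam[hrs].sum()))

# EM for the hourly intensities lam[h] (no smoothness penalty in this sketch;
# the paper adds penalties over time of day and day of week).
lam = np.ones(H)
exposure = np.zeros(H)                   # how often each hour is covered
for hrs in intervals:
    exposure[hrs] += 1
for _ in range(200):
    expected = np.zeros(H)
    for hrs, n in zip(intervals, counts):
        p = lam[hrs] / lam[hrs].sum()    # E-step: split the interval total
        expected[hrs] += n * p           # across its hours proportionally
    lam = expected / np.maximum(exposure, 1e-12)   # M-step: Poisson MLE

print("correlation with truth:", np.corrcoef(lam, true_lam)[0, 1].round(3))
```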

Keywords: cluster detection, EM algorithm, interval censoring, intensity estimation

Procedia PDF Downloads 43
19690 Development of Graph-Theoretic Model for Ranking Top of Rail Lubricants

Authors: Subhash Chandra Sharma, Mohammad Soleimani

Abstract:

Selection of the correct lubricant for the top-of-rail application is a complex process. In this paper, the selection of the proper lubricant for a top-of-rail (TOR) lubrication system based on graph theory and a matrix approach has been developed. Attributes influencing the selection process, and their influence on each other, have been represented through a digraph and an equivalent matrix. A matrix function, called the permanent function, is derived. By substituting the level of inherent contribution of the influencing parameters and their influence on each other qualitatively, a criterion called the suitability index is derived. Based on these indices, lubricants can be ranked for their suitability. The proposed model can be useful for maintenance engineers in selecting the best lubricant for a TOR application. The proposed methodology is illustrated step-by-step through an example.
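
To make the permanent-function step concrete, the following sketch computes the permanent of an attribute matrix with Ryser's inclusion-exclusion formula; the 4x4 matrix, its qualitative scores, and the attribute count are hypothetical, not values taken from the paper.

```python
from itertools import combinations
import numpy as np

def permanent(M: np.ndarray) -> float:
    """Permanent via Ryser's inclusion-exclusion formula, O(2^n * n)."""
    n = M.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            total += (-1) ** (n - r) * np.prod(M[:, list(cols)].sum(axis=1))
    return total

# Hypothetical 4-attribute matrix for one candidate TOR lubricant:
# diagonal = inherent contribution of each attribute (qualitative scores),
# off-diagonal = relative influence of attribute i over attribute j.
M = np.array([[7, 3, 2, 4],
              [2, 6, 3, 2],
              [3, 2, 8, 3],
              [1, 4, 2, 5]], dtype=float)

print("suitability index (permanent):", permanent(M))
```

Evaluating the same function for each candidate lubricant's matrix and sorting the resulting indices reproduces the ranking step of the methodology.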

Keywords: lubricant selection, top of rail lubrication, graph theory, ranking of lubricants

Procedia PDF Downloads 268
19689 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference

Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira

Abstract:

Operational risk losses are heavy-tailed and are likely to be asymmetric and extremely dependent among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixed distribution (the lognormal for the body of losses and the Generalized Pareto Distribution for the tail) via extreme value theory using SAS®, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. This paper analyses a new operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2 using the procedures PROC SEVERITY and PROC NLMIXED, and this paper focuses on describing this implementation.
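
A minimal open-source sketch of the body-tail severity splice follows (the paper's implementation is in SAS PROC SEVERITY; the simulated losses, the 90% threshold choice, and the quantile level here are assumptions): fit a lognormal body and a GPD tail, then read off a high quantile with the standard peaks-over-threshold formula.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated operational losses (a heavy-tailed stand-in for SAS OpRisk data).
losses = np.exp(rng.normal(10, 1.4, 5000))

# Spliced severity model: lognormal body below a threshold u,
# Generalized Pareto Distribution (GPD) for the exceedances above u.
u = np.quantile(losses, 0.90)            # threshold choice is an assumption
body, tail = losses[losses <= u], losses[losses > u]

shape_ln, loc_ln, scale_ln = stats.lognorm.fit(body, floc=0)
xi, loc_gp, beta = stats.genpareto.fit(tail - u, floc=0)

# High quantile of the spliced distribution via the POT formula (xi != 0):
# VaR_p = u + (beta/xi) * (((1-p)/phi_u)**(-xi) - 1),
# where phi_u is the fraction of losses exceeding u.
p, phi_u = 0.999, len(tail) / len(losses)
var_p = u + (beta / xi) * (((1 - p) / phi_u) ** (-xi) - 1)
print(f"u = {u:,.0f},  GPD shape xi = {xi:.3f},  VaR 99.9% = {var_p:,.0f}")
```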

Keywords: operational risk, loss distribution approach, extreme value theory, copulas

Procedia PDF Downloads 564
19688 Financial Market Reaction to Non-Financial Reports

Authors: Petra Dilling

Abstract:

This study examines the market reaction to the publication of integrated reports for a sample of 316 global companies for the reporting year 2018. Applying event study methodology, we find significant cumulative average abnormal returns (CAARs) after the publication date. To ensure robust estimation results, the Fama-French three-factor model is used, as well as a market-adjusted model, a CAPM, and a Fama-French model taking GARCH effects into account. We find a significant positive CAAR after the publication day of the integrated report. Our results suggest that investors react to information provided in the integrated report and that they react to it differently than to the annual financial report. Furthermore, our cross-sectional analysis confirms that companies with a significant positive cumulative average abnormal return show certain characteristics. It was found that European companies are more likely to experience a stronger significant positive market reaction to their integrated report publication.
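
The core CAAR computation can be sketched as follows (simulated returns and a simple market model only; the paper additionally uses Fama-French three-factor and GARCH specifications, and the publication-day bump below is injected artificially for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n_firms, est_win, evt_win = 316, 120, 11   # event window: days -5..+5

mkt = rng.normal(0.0004, 0.01, est_win + evt_win)   # simulated market return
caar = np.zeros(evt_win)
for _ in range(n_firms):
    b_true = rng.normal(1.0, 0.3)
    ret = 0.0002 + b_true * mkt + rng.normal(0, 0.015, est_win + evt_win)
    ret[est_win + 5] += 0.004              # hypothetical publication-day bump

    # Market model estimated over the estimation window only.
    X = np.column_stack([np.ones(est_win), mkt[:est_win]])
    alpha, beta = np.linalg.lstsq(X, ret[:est_win], rcond=None)[0]

    # Abnormal returns in the event window, accumulated across firms.
    ar = ret[est_win:] - (alpha + beta * mkt[est_win:])
    caar += np.cumsum(ar)

caar /= n_firms
print("CAAR over event window:", np.round(caar, 5))
```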

Keywords: integrated report, event study methodology, cumulative abnormal return, sustainability, CAPM

Procedia PDF Downloads 118
19687 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluation of the goodness-of-fit and the comparison of several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, the data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice (once for model estimation and once for testing), a bias correction that penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive variant among the CV methods, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to the exact LOO-CV; they utilise the existing MCMC results and avoid expensive refitting. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly; in contrast, the larger weights are replaced by their modified truncated weights in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV are unable to reflect goodness-of-fit in an absolute sense, the differences can be used to measure the relative performance of the models of interest. However, the use of these measures is only valid under specific circumstances. This study has developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve the PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered to be approximations of the exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of the pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles for the models, conditional on equal posterior variances in lppds, were observed. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
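
The quantities being compared can be computed in a few lines from a pointwise log-likelihood matrix. The sketch below is a toy normal-mean model with simulated draws standing in for real MCMC output: it evaluates WAIC and the raw IS-LOO estimate, whose per-observation importance weights are the reciprocals of the predictive densities; TIS and PSIS would truncate or smooth the largest weights before averaging.

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(3)
y = rng.normal(0.5, 1.0, 50)

# Stand-in "posterior draws" for a normal(mu, 1) model (a real analysis
# would take these from MCMC output).
mu_draws = rng.normal(y.mean(), 1 / np.sqrt(len(y)), 4000)

# Pointwise log-likelihood matrix: S draws x n observations.
loglik = -0.5 * np.log(2 * np.pi) - 0.5 * (y[None, :] - mu_draws[:, None]) ** 2
S = loglik.shape[0]

# WAIC: lppd minus the effective-parameter penalty p_waic.
lppd = logsumexp(loglik, axis=0) - np.log(S)
p_waic = loglik.var(axis=0, ddof=1)
waic = -2 * (lppd.sum() - p_waic.sum())

# Raw IS-LOO with weights 1/p(y_i | theta_s); in this form the weighted
# average collapses to the harmonic mean of the pointwise densities.
elpd_loo = (np.log(S) - logsumexp(-loglik, axis=0)).sum()

print(f"WAIC = {waic:.1f},  IS-LOO elpd = {elpd_loo:.1f},  lppd = {lppd.sum():.1f}")
```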

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 364
19686 Modeling of Traffic Turning Movement

Authors: Michael Tilahun Mulugeta

Abstract:

Pedestrians are the most vulnerable road users, as they are more exposed to the risk of collision. Pedestrian safety at road intersections remains a vital and yet unsolved issue in Addis Ababa, Ethiopia. One of the critical points in pedestrian safety is the occurrence of conflicts between turning vehicles and pedestrians at un-signalized intersections. A better understanding of the factors that affect the likelihood of these conflicts would help provide direction for countermeasures aimed at reducing the number of crashes. This paper seeks to explore a model describing the relation between traffic conflicts and influencing factors using a multiple linear regression methodology. The main focus of this research is to study the interaction of turning (left and right) vehicles with pedestrians at un-signalized intersections. The specific objectives are to determine the factors that affect the number of potential conflicts and to develop a model of potential conflict.
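
A minimal sketch of such a conflict model follows; the predictors, coefficients, and simulated counts are hypothetical, since the abstract does not list the actual covariates. It fits conflict counts on candidate intersection factors by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 120                                   # observed intersection-hours (simulated)

# Hypothetical predictors: turning-vehicle volume, pedestrian volume,
# and approach speed; the generating coefficients below are invented.
turn = rng.integers(20, 200, n)
ped = rng.integers(10, 300, n)
speed = rng.uniform(20, 50, n)
conflicts = 0.5 + 0.02 * turn + 0.01 * ped + 0.08 * speed + rng.normal(0, 1, n)

# Multiple linear regression: conflicts ~ intercept + turn + ped + speed.
X = np.column_stack([np.ones(n), turn, ped, speed])
beta, res, *_ = np.linalg.lstsq(X, conflicts, rcond=None)
r2 = 1 - res[0] / ((conflicts - conflicts.mean()) ** 2).sum()
print("coefficients:", np.round(beta, 3), " R^2 =", round(r2, 3))
```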

Keywords: potential, regression analysis, pedestrian, conflicts

Procedia PDF Downloads 26
19685 An Inquiry into the Usage of Complex Systems Models to Examine the Effects of the Agent Interaction in a Political Economic Environment

Authors: Ujjwall Sai Sunder Uppuluri

Abstract:

Group theory is a powerful tool that researchers can use to provide a structural foundation for their agent-based models, and this paper argues that such models are the future of the social science disciplines. More specifically, researchers can use them to apply evolutionary theory to the study of complex social systems. This paper illustrates one example of how an agent-based model can theoretically be formulated from the application of group theory, systems dynamics, and evolutionary biology to analyze the strategies pursued by states to mitigate risk and maximize the usage of resources in order to achieve the objective of economic growth. This example can be applied to other social phenomena, and this is what makes group theory so useful to the analysis of complex systems: the theory provides the mathematical proof for validating the complex-systems models that researchers build, as the paper will discuss. The aim of this research is also to provide researchers with a framework that can be used to model political entities such as states on a 3-dimensional plane, with x representing the resources (tangible and intangible) available to them, y the risks, and z the objective. There also exist other states with different constraints pursuing different strategies to climb the mountain. This mountain's environment is made up of the risks the state faces and its resource endowments. The mountain is also layered, in the sense that it has multiple peaks that must be overcome to reach the tallest peak. A state that sticks to a single strategy, or pursues a strategy that is not conducive to climbing the specific peak it has reached, is not able to continue its advancement. To overcome the obstacle in its path, the state must innovate. Based on the definition of a group, we can categorize each state as its own group. Each state is a closed system, one which is made up of micro-level agents who have their own vectors and pursue strategies (actions) to achieve some sub-objectives. The state also has an identity, the inverse being anarchy and/or inaction. Finally, the agents making up a state interact with each other through competition and collaboration to mitigate risks and achieve sub-objectives that fall within the primary objective. Thus, researchers can categorize the state as an organism that reflects the sum of the output of the interactions pursued by agents at the micro level. When states compete, they employ strategies, and the state with the better strategy (reflected by the strategies pursued by its parts) is able to out-compete its counterpart to acquire some resource, mitigate some risk, or fulfil some objective. This paper will attempt to illustrate how group theory, combined with evolutionary theory and systems dynamics, can allow researchers to model the long-run development, evolution, and growth of political entities through a bottom-up approach.

Keywords: complex systems, evolutionary theory, group theory, international political economy

Procedia PDF Downloads 109
19684 Point Estimation for the Type II Generalized Logistic Distribution Based on Progressively Censored Data

Authors: Rana Rimawi, Ayman Baklizi

Abstract:

Skewed distributions are important models that are frequently used in applications. Generalized distributions form a class of skewed distributions and have gained widespread use in applications because of their flexibility in data analysis. More specifically, the generalized logistic distribution, with its different types, has received considerable attention recently. In this study, based on progressively type-II censored data, we consider point estimation in the Type II Generalized Logistic Distribution (Type II GLD). We develop several estimators for its unknown parameters, including maximum likelihood estimators (MLEs), Bayes estimators, and best linear unbiased estimators (BLUEs). The estimators are compared by simulation based on the criteria of bias and mean square error (MSE). An illustrative example with a real data set is given.
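
A sketch of the maximum likelihood step under progressive type-II censoring follows. It is illustrative only: the sample and removal scheme are hypothetical, the distribution is reduced to its shape parameter, and the parametrization assumed is the common one in which the Type II GLD has density f(x; a) = a e^(-ax) / (1 + e^(-x))^(a+1) and survival S(x) = (1 + e^x)^(-a). Each removed unit contributes a survival factor to the likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical progressively type-II censored sample: at the i-th observed
# failure time x[i], R[i] surviving units were removed from the test.
x = np.array([-1.8, -1.1, -0.6, -0.2, 0.3, 0.9, 1.5])
R = np.array([2, 0, 1, 0, 2, 0, 3])

def neg_loglik(alpha):
    # log f and log S under the assumed Type II GLD parametrization.
    if alpha <= 0:
        return np.inf
    logf = np.log(alpha) - alpha * x - (alpha + 1) * np.log1p(np.exp(-x))
    logS = -alpha * np.log1p(np.exp(x))
    # Progressive censoring likelihood: prod f(x_i) * S(x_i)**R_i.
    return -(logf.sum() + (R * logS).sum())

res = minimize_scalar(neg_loglik, bounds=(1e-3, 50), method="bounded")
print("MLE of the shape parameter alpha:", round(res.x, 4))
```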

Keywords: point estimation, type II generalized logistic distribution, progressive censoring, maximum likelihood estimation

Procedia PDF Downloads 167
19683 A Frictional-Collisional Closure Model for the Saturated Granular Flow: Experimental Evidence and Two Phase Modelling

Authors: Yunhui Sun, Qingquan Liu, Xiaoliang Wang

Abstract:

Dense granular flows widely exist in geological flows such as debris flows, landslides, and sheet flows, where both interparticle and solid-liquid interactions are important in modifying the flow. A two-phase approach with both phases correctly modelled is therefore important for a better investigation of saturated granular flows. However, a proper closure model covering a wide range of flowing states for the solid phase is still lacking. This study first employs a chute-flow experiment based on the refractive index matching method, which makes it possible to obtain internal flow information such as velocity, shear rate, granular fluctuation, and volume fraction. The granular stress is obtained based on a steady-flow assumption. The kinetic theory is found to describe the dependence of the stress on the flow state well. More importantly, the granular rheology is found to be frictionally dominated under weak shear and collisionally dominated under strong shear. The results thus provide direct experimental evidence for a possible frictional-collisional closure model for the granular phase. The data indicate that frictional stress exists over a wide range of volume fractions, though traditional theory holds that it vanishes below a critical volume fraction. Based on these findings, a two-phase model is used to simulate the chute flow. Both phases are modelled as continuum media, and the inter-phase interactions, such as the drag force and the pressure gradient force, are considered. The frictional-collisional model is used for the closure of the solid-phase stress. The profiles of the kinematic properties agree well with the experiments. The model is further used to simulate immersed granular collapse, which is unsteady in nature, to study the applicability of a model derived from steady flow.

Keywords: closure model, collision, friction, granular flow, two-phase model

Procedia PDF Downloads 33
19682 3D Printing Perceptual Models of Preference Using a Fuzzy Extreme Learning Machine Approach

Authors: Xinyi Le

Abstract:

In this paper, 3D printing orientations were determined through our perceptual model. Some FDM (Fused Deposition Modeling) 3D printers, which are widely used in universities and industries, often require support structures during additive manufacturing. After removing the residual material, some surface artifacts remain at the contact points. These artifacts damage the function and visual effect of the model. To mitigate the impact of these artifacts, we present a fuzzy extreme learning machine approach to find printing directions that avoid placing supports in perceptually significant regions. The proposed approach is able to solve the evaluation problem by combining both subjective knowledge and objective information. Our method combines the advantages of fuzzy theory, auto-encoders, and the extreme learning machine. Fuzzy set theory is applied to deal with subjective preference information, and an auto-encoder step is used to extract good features without supervised labels before the extreme learning machine. An extreme learning machine method is then developed for training and learning perceptual models. The performance of this perceptual model is demonstrated on both natural and man-made objects. It is a good human-computer interaction practice which draws on supporting knowledge from both the machine side and the human side.
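
The extreme learning machine at the core of the pipeline is simple to state in code. The sketch below is a generic ELM regressor on toy data, not the paper's pipeline: in the paper, the inputs would come from the auto-encoder features and fuzzy-processed preference scores.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy regression data standing in for (feature, preference-score) pairs.
X = rng.uniform(-1, 1, (200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 - X[:, 2] + rng.normal(0, 0.05, 200)

# Extreme learning machine: random hidden layer, output weights by least squares.
L = 50                                       # number of hidden neurons
W = rng.normal(0, 1, (3, L))                 # random (and fixed) input weights
b = rng.normal(0, 1, L)
H = np.tanh(X @ W + b)                       # hidden-layer activations
beta = np.linalg.pinv(H) @ y                 # only the output layer is "trained"

pred = H @ beta
print("train RMSE:", round(float(np.sqrt(((pred - y) ** 2).mean())), 4))
```

Because only the output weights are solved in closed form, training is orders of magnitude faster than backpropagation, which is the property the paper exploits.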

Keywords: 3D printing, perceptual model, fuzzy evaluation, data-driven approach

Procedia PDF Downloads 405
19681 Guests’ Satisfaction and Intention to Revisit Smart Hotels: Qualitative Interviews Approach

Authors: Raymond Chi Fai Si Tou, Jacey Ja Young Choe, Amy Siu Ian So

Abstract:

Smart hotels can be defined as hotels that use intelligent systems, digitalization, and networking to manage hotel operations and service information. In addition, smart hotels feature high-end designs that integrate information and communication technology with hotel management, fulfilling guests’ needs and improving the quality, efficiency, and satisfaction of hotel management. The purpose of this study is to identify appropriate factors that may influence guests’ satisfaction and intention to revisit smart hotels, based on the service quality measurement of the lodging quality index and the extended UTAUT theory. The Unified Theory of Acceptance and Use of Technology (UTAUT) is adopted as a framework to explain technology acceptance and use. Since smart hotels are technology-based infrastructure hotels, UTAUT theory provides a suitable theoretical background for examining guests’ acceptance and use after staying in smart hotels. The UTAUT identifies four key drivers of the adoption of information systems: performance expectancy, effort expectancy, social influence, and facilitating conditions. The extended UTAUT considers seven constructs: the four constructs of the original UTAUT model together with three new additional constructs, namely hedonic motivation, price value, and habit. Thus, the seven constructs from the extended UTAUT theory can be adopted to understand guests’ intention to revisit smart hotels. The service quality model is also adopted and integrated into the framework. Few studies have examined the effect of service quality on guests’ satisfaction and intention to revisit smart hotels; in this study, the Lodging Quality Index (LQI) is adopted to measure service quality in smart hotels. The integrated UTAUT theory and service quality model are used because technological applications and services require more than one model to understand the complicated situation of customers’ acceptance of new technology. Moreover, an integrated model can provide more insightful perspectives to explain the relationships among the constructs than could be obtained from only one model. For this research, ten in-depth interviews are planned. In order to confirm the applicability of the proposed framework and gain an overview of the guest experience of smart hotels from the hospitality industry, in-depth interviews with hotel guests and industry practitioners will be conducted. In terms of the theoretical contribution, it is predicted that the models integrated from the UTAUT theory and the service quality model will provide new insights into the factors that influence guests’ satisfaction and intention to revisit smart hotels. After this study identifies influential factors, smart hotel practitioners will understand which factors may significantly influence smart hotel guests’ satisfaction and intention to revisit. In addition, smart hotel practitioners can provide outstanding guest experiences by improving their service quality along the dimensions identified by the service quality measurement. Thus, the study will be beneficial to the sustainability of the smart hotel business.

Keywords: intention to revisit, guest satisfaction, qualitative interviews, smart hotels

Procedia PDF Downloads 174
19680 The Impact of Job Meaningfulness on the Relationships between Job Autonomy, Supportive Organizational Climate, and Job Satisfaction

Authors: Sashank Nyapati, Laura Lorente-Prieto, Maria Peiro

Abstract:

The general objective of this study is to analyse the mediating role of meaningfulness in the relationships between job autonomy and job satisfaction and between supportive organizational climate and job satisfaction. Theories such as the Job Characteristics Model, Conservation of Resources theory, and the Job Demands-Resources theory were used as the theoretical framework. Data were obtained from the 5th European Working Conditions Survey (EWCS), and the sample was composed of 1005 and 1000 workers from Spain and Portugal, respectively. The analysis was conducted using the SOBEL macro for SPSS (a multiple-regression mediation model) developed by Preacher and Hayes in 2003. Results indicated that meaningfulness partially mediates both the job autonomy-job satisfaction relationship and the supportive organizational climate-job satisfaction relationship. The percentages are large enough to draw substantial conclusions, especially that job meaningfulness plays an essential, if indirect, role in the amount of satisfaction that one experiences at work. Some theoretical and practical implications are discussed.
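
The Sobel-style mediation test behind this kind of analysis can be sketched in a few lines; the variables below are simulated stand-ins for the EWCS measures and the generating path coefficients are invented. The sketch estimates the a-path (X to M) and the b-path (M to Y, controlling for X), then forms the normal-theory z statistic for the indirect effect a*b.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1000

# Simulated stand-ins: autonomy -> meaningfulness -> satisfaction,
# plus a direct autonomy -> satisfaction path.
autonomy = rng.normal(0, 1, n)
meaning = 0.4 * autonomy + rng.normal(0, 1, n)
satisfaction = 0.3 * meaning + 0.2 * autonomy + rng.normal(0, 1, n)

def ols(X, y):
    """OLS coefficients and standard errors (intercept added)."""
    X = np.column_stack([np.ones(len(y)), X])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b
    cov = resid.var(ddof=X.shape[1]) * np.linalg.inv(X.T @ X)
    return b, np.sqrt(np.diag(cov))

b_a, se_a = ols(autonomy, meaning)                                  # a-path
b_b, se_b = ols(np.column_stack([meaning, autonomy]), satisfaction) # b-path
a, sa = b_a[1], se_a[1]
b, sb = b_b[1], se_b[1]

# Sobel z for the indirect effect a*b.
z = (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
print(f"indirect effect = {a*b:.3f}, Sobel z = {z:.2f}")
```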

Keywords: meaningfulness, job autonomy, supportive organizational climate, job satisfaction

Procedia PDF Downloads 506
19679 Generating Music with More Refined Emotions

Authors: Shao-Di Feng, Von-Wun Soo

Abstract:

Generating symbolic music with specific emotions is a challenging task because symbolic music datasets with emotion labels are scarce and incomplete. This research aims to generate music with more refined emotions based on training datasets that are labeled only with the four quadrants of Russell's 2D emotion model. We build on the theory of Music FaderNets, mapping arousal and valence to low-level attributes, and construct a symbolic music generation model by combining a transformer with a GM-VAE. We adopt an in-attention mechanism for the model and improve it by allowing modulation by conditional information. We show that the model can control the generation of music according to the emotions specified by users in terms of high-level linguistic expression, by manipulating the corresponding low-level musical attributes. Finally, we evaluate the model's performance using a pre-trained emotion classifier on a pop piano MIDI dataset called EMOPIA, and through subjective listening evaluation we demonstrate that the model can correctly generate music with more refined emotions.

Keywords: music generation, music emotion controlling, deep learning, semi-supervised learning

Procedia PDF Downloads 58
19678 Active Flutter Suppression of Sports Aircraft Tailplane by Supplementary Control Surface

Authors: Aleš Kratochvíl, Svatomír Slavík

Abstract:

The paper presents aircraft flutter suppression by active damping of a supplementary control surface at the trailing edge. A mathematical model of a thin oscillating airfoil with a pilot-driven control surface is developed, and a supplementary control surface driven by a control law is added. Active damping of flutter by several control laws is presented. A structural model of the tailplane, with an aerodynamic strip theory based on the airfoil model, is developed by the finite element method. An optimization of the stiffness parameters is carried out to match the structural model with results from a ground vibration test of a small sport airplane. The implementation of the supplementary control surface driven by the control law is presented, and the active damping of the tailplane model is shown.

Keywords: active damping, finite element method, flutter, tailplane model

Procedia PDF Downloads 268
19677 The Problem of Now in Special Relativity Theory

Authors: Mogens Frank Mikkelsen

Abstract:

Special Relativity Theory (SRT) includes only one characteristic of light, namely that its speed is the same for all observers; by excluding other relevant characteristics of light, the common interpretation of SRT should be regarded as merely an approximative theory. By rethinking the iconic double light cones, a revised version of SRT can be developed. The revised concept of light cones acknowledges an asymmetry between past and future light cones and introduces a concept of the extended past to explain the predictions as something other than the future. Combining this with the concept of photon-paired events leads to the inference that Special Relativity Theory can support the existence of Now.

Keywords: relativity, light cone, Minkowski, time

Procedia PDF Downloads 50
19676 A Novel Geometrical Approach toward the Mechanical Properties of Particle Reinforced Composites

Authors: Hamed Khezrzadeh

Abstract:

Many investigations of the micromechanical structure of materials indicate that fractal patterns exist at the micro scale in some of the main construction and industrial materials. A recently presented micro-fractal theory brings together well-known periodic homogenization and fractal geometry to construct an appropriate model for determining the mechanical properties of particle-reinforced composite materials. The proposed multi-step homogenization scheme considers the mechanical properties of the different constituent phases in the composite, together with the interaction between these phases, through a step-by-step homogenization technique. With this method, the effect of fiber grading on the mechanical properties can also be studied. The theoretical outcomes are compared with experimental data for different types of particle-reinforced composites, and very good agreement with the experimental data is observed.

Keywords: fractal geometry, homogenization, micromechanics, particulate composites

Procedia PDF Downloads 262
19675 A New Developed Formula to Determine the Shear Buckling Stress in Welded Aluminum Plate Girders

Authors: Badr Alsulami, Ahmed S. Elamary

Abstract:

This paper summarizes and presents the main results of an in-depth numerical analysis dealing with the shear buckling resistance of aluminum plate girders. The studies conducted have permitted the development of a simple design expression to determine the critical shear buckling stress in aluminum web panels. This expression takes into account the reduction of strength in aluminum alloys due to the welding process. The ultimate shear resistance (USR) of plate girders can be obtained theoretically using Cardiff theory or Höglund's theory. The USR of aluminum alloy plate girders predicted theoretically using BS8118 appears inconsistent when compared with test data, while theoretical predictions based on Höglund's theory are more realistic. Cardiff theory was proposed to predict the USR of steel plate girders only. Welded aluminum alloy plate girders studied experimentally by others are reviewed, together with the USRs resulting from the tests. A comparison of the test results with the values obtained from Höglund's theory, the BS8118 design method, and Cardiff theory is performed. Finally, a new equation based on Cardiff tension-field theory is proposed to predict theoretically the USR of aluminum plate girders.

Keywords: shear resistance, aluminum, Cardiff theory, Höglund's theory, plate girder

Procedia PDF Downloads 384
19674 Men's Intimate Violence: Theory and Practice Relationship

Authors: Omer Zvi Shaked

Abstract:

Intimate Partner Violence (IPV) is a widespread social problem. Since the 1970s, and due to political changes resulting from the feminist movement, western society has been changing its attitude towards the phenomenon and has been taking an active approach to reduce its magnitude. Initiatives in the form of legislation, awareness and prevention campaigns, women's shelters, and community intervention programs became more prevalent as the years progressed. Although many initiatives were found to be productive, the effectiveness of one has remained questionable throughout the years: intervention programs for men's intimate violence. Surveys outline two main intervention models for men's intimate violence. The first is the Duluth model, which argues that men are socialized to be dominant, while women are socialized to be subordinate, and that men are therefore required by social imperative to enforce their dominance, physically if necessary. The Duluth model became the chief authorized intervention program, and some states in the US even regulated it as the standard criminal justice program for men's intimate violence. However, meta-analysis findings demonstrated that, based on partners' reports, Duluth treatment completers have a 44% recidivism rate, with a dropout range between 40% and 85%. The second model is the cognitive-behavioral model (CBT), which is a highly accepted intervention worldwide. The model argues that cognitive misrepresentations of intimate situations frequently precede violent behaviors when a predisposition to anger exists. Since anger dysregulation mediates between one's cognitive schemes and a violent response, anger regulation became the chief purpose of the intervention. Yet, a meta-analysis found only a 56% risk reduction for CBT interventions. It is, therefore, crucial to understand the background behind the dominance of both the Duluth model and CBT interventions. This presentation will discuss the ways in which theoretical conceptualizations of men's intimate violence, as well as ideologies, contributed to the wide acceptance of the above-mentioned interventions despite the known lack of scientific and evidential support. First, the presentation will review the prominent interventions for male intimate violence, the Duluth model and CBT. Second, the presentation will review the prominent theoretical models explaining men's intimate violence: the patriarchal model, the abusive personality model, and the post-traumatic stress model. Third, the presentation will discuss the interrelation between theory and practice and the nature of the affinity between research and practice regarding men's intimate violence. Finally, the presentation will set new directions for further research, aiming to improve the efficiency of interventions with men's intimate violence and to advance social work practice in the field.

Keywords: intimate partner violence, theory and practice relationship, Duluth, CBT, abusive personality, post-traumatic stress

Procedia PDF Downloads 107
19673 Effect of Normal Deformation on the Stability of Sandwich Beams Simply Supported Using a Refined Four-Variable Beam Theory

Authors: R. Bennai, M. Nebab, H. Ait Atmane, B. Ayache, H. Fourn

Abstract:

In this work, a study of the stability of functionally graded sandwich beams was developed using a refined hyperbolic shear deformation beam theory. The effects of transverse shear strains and the transverse normal deformation are considered. The constituent materials of the beam are assumed to vary gradually in the height direction according to a simple power-law distribution in terms of the volume fractions of the constituents; the two materials considered are metal and ceramic. To examine the present model, illustrative examples are presented to show the effects of changes in different parameters, such as the material gradation, the thickness-stretching effect, and the length-to-thickness ratio, on the buckling of FGM sandwich beams.

Keywords: FGM materials, refined shear deformation theory, stretching effect, buckling, boundary conditions

Procedia PDF Downloads 155
19672 Analysis of Complex Business Negotiations: Contributions from Agency-Theory

Authors: Jan Van Uden

Abstract:

The paper reviews classical agency theory and its contributions to the analysis of complex business negotiations and gives an approach for modifying the basic agency model in order to examine the negotiation-specific dimensions of agency problems. By illustrating fundamental potentials for the modification of agency theory in the context of business negotiations, the paper highlights recent empirical research that investigates agent-based negotiations and inter-team constellations. A general theoretical analysis of complex negotiations would be based on a two-level approach: first, the modification of the basic agency model to reflect the organizational context of business negotiations (i.e., multi-agent issues, common agencies, multi-period models, and the concept of bounded rationality); second, the application of the modified agency model to complex business negotiations to identify agency problems and related areas of risk in the negotiation process. The paper is placed on the first level of analysis, the modification. The method builds, on the one hand, on insights from behavior decision research (BRD) and, on the other hand, on findings from agency theory as normative directives for the modification of the basic model. Through neoclassical assumptions concerning the fundamental aspects of agency relationships in business negotiations (i.e., asymmetric information, self-interest, risk preferences, and conflicts of interest), agency theory helps to draw solutions for stated worst-case scenarios taken from the daily negotiation routine. As agency theory is the only universal approach able to identify trade-offs between certain aspects of economic cooperation, the insights obtained provide a deeper understanding of the forces that shape business negotiation complexity. The need for a modification of the basic model is illustrated by highlighting selected issues of business negotiations from an agency-theory perspective. Negotiation teams require a multi-agent approach, given that decision-makers, as superior agents, are often part of the team; the diversity of competences and decision-making authority is a phenomenon that overrides the assumptions of classical agency theory and varies greatly across certain forms of business negotiations. Further, the basic model is bound to dyadic relationships preceded by the delegation of decision-making authority and builds on a contractually created (vertical) hierarchy. As a result, horizontal dynamics within the negotiation team, which play an important role in negotiation success, are not considered in the investigation of agency problems. Also, the trade-off between short-term relationships within the negotiation sphere and the long-term relationships of the corporate sphere calls for a multi-period perspective taking into account the sphere-specific governance mechanisms already established (i.e., reward and monitoring systems). Within the analysis, the implementation of bounded rationality draws closely on findings from BRD to assess the impact of negotiation behavior on the underlying principal-agent relationships. As empirical findings show, the disclosure and reservation of information to the agent affect his negotiation behavior as well as the final negotiation outcomes. Last, in the context of business negotiations, asymmetric information is often intended by decision-makers acting as superior agents or principals, which calls for a bilateral risk approach to agency relations.

Keywords: business negotiations, agency-theory, negotiation analysis, inter-team negotiations

Procedia PDF Downloads 114
19671 Modeling of the Heat and Mass Transfer in Fluids through Thermal Pollution in Pipelines

Authors: V. Radulescu, S. Dumitru

Abstract:

Introduction: Determination of the temperature field inside a fluid in motion raises many practical issues, especially in the case of turbulent flow. The phenomenon is stronger when the solid walls are at a different temperature from the fluid. Turbulent heat and mass transfer play an essential role in thermal pollution, as was recorded during the damage at the Oradea thermoelectric power plant (still closed today). Basic Methods: Solving turbulent thermal pollution theoretically is a particularly difficult problem. By using semi-empirical theories, or by simplifying the assumptions made on the basis of experimental measurements, a mathematical model can be elaborated for further numerical simulations. The three zones of flow are analyzed separately: the vicinity of the solid wall, the turbulent transition zone, and the turbulent core. For each zone, the temperature distribution law is determined. The dependence between the Stanton and Prandtl numbers is determined with correction factors, based on experimental measurements. Major Findings/Results: The limit of the laminar thermal sublayer was determined based on the theory of Landau and Levich, using the assumption that the longitudinal component of the velocity pulsation and the pulsation's frequency vary proportionally with the distance to the wall. For the calculation of the average temperature, a solution similar to that for the velocity is used, by an analogous averaging. On these assumptions, the numerical modeling was performed with a temperature gradient for the turbulent flow in pipes (intact or damaged, with cracks) having four different diameters between 200 and 500 mm, as were present at the Oradea thermoelectric power plant. Conclusions: A superposition of the molecular and turbulent viscosities was made, followed by the addition of the molecular and turbulent transfer coefficients, as necessary for the theoretical and numerical modeling. The laminar boundary layer has a different thickness when flow with heat transfer is compared with flow without a temperature gradient. The results obtained from the developed model agree with the classical semi-empirical theories within a 5% margin of error, based on the experimental data. Finally, a general correlation between the Stanton number and the Prandtl number is obtained for a specific flow (with an associated Reynolds number).
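
The abstract does not reproduce the fitted coefficients. For orientation only, turbulent heat-transfer correlations of this kind are conventionally written in the power-law form below; the constants C, m, n are placeholders rather than the paper's values, and the classical Chilton-Colburn analogy is shown as a familiar special case linking the Stanton number to the skin-friction coefficient.

```latex
% Conventional power-law form of a turbulent heat-transfer correlation;
% C, m, n are placeholders, not the coefficients fitted in the paper.
\[
  \mathrm{St} = \frac{\mathrm{Nu}}{\mathrm{Re}\,\mathrm{Pr}}
              = C\,\mathrm{Re}^{m}\,\mathrm{Pr}^{n}
\]
% Classical Chilton--Colburn analogy, a familiar special case relating
% St to the skin-friction coefficient C_f:
\[
  \mathrm{St}\,\mathrm{Pr}^{2/3} = \frac{C_f}{2}
\]
```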

Keywords: experimental measurements, numerical correlations, thermal pollution through pipelines, turbulent thermal flow

Procedia PDF Downloads 134
19670 Construction of a Low Carbon Eco-City Index System Based on CAS Theory: A Case of Hexi Newtown in Nanjing, China

Authors: Xu Tao, Yilun Xu, Dingwei Xiang, Yaofei Sun

Abstract:

The practice of urban planning and construction based on the concept of the “low carbon eco-city” has been widely accepted by the academic community in response to urban issues such as population, resources, environment, and social development. On this basis, the current article first analyzes the concept of the low carbon eco-city, then builds a complex adaptive system (CAS) framework grounded in traditional Chinese philosophical thinking and analyzes the adaptive relationship between material and non-material elements. A three-dimensional evaluation model covering natural ecology, a low-carbon economy, and social harmony was constructed. Finally, the construction of a low carbon eco-city index system in Hexi Newtown of Nanjing was used as an example to verify the effectiveness of the research results; this paradigm provides a new way to achieve a low carbon eco-city system.

Keywords: complex adaptive system, low carbon ecology, index system, model

Procedia PDF Downloads 132
19669 System Identification and Quantitative Feedback Theory Design of a Lathe Spindle

Authors: M. Khairudin

Abstract:

This paper investigates system identification and quantitative feedback theory (QFT) design for the robust control of a lathe spindle. The dynamics of the lathe spindle are uncertain and time-varying due to the variation of cutting depth during the cutting process. System identification was used to obtain a dynamic model of the lathe spindle. In this work, real-time system identification is used to construct linear models of the nonlinear system; these linear models and their uncertainty bounds can then be used for controller synthesis. The real-time identification process yields a set of linear models of the lathe spindle that represents the operating ranges of the dynamic system. With a selected input signal, the output response data are acquired, and system identification is performed using Matlab to obtain a linear model of the system. Practical design steps are presented in which the QFT-based conditions are formulated to obtain a compensator and pre-filter to control the lathe spindle. The performance of the proposed controller is evaluated in terms of the velocity responses of the lathe machine spindle, incorporating the cutting depth of the cutting process.
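
A common way to obtain such a linear model from input-output data is least-squares ARX identification. The sketch below is illustrative only: the paper identifies its models in Matlab from experimental data, whereas the second-order dynamics, model orders, and noise level here are invented. Repeating the fit at several operating points (cutting depths) would produce the model set whose spread defines the QFT uncertainty templates.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 500

# Simulated spindle velocity response to an excitation input u
# (a made-up second-order stand-in for the experimental data).
u = rng.uniform(-1, 1, N)
y = np.zeros(N)
for k in range(2, N):
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.5 * u[k-1] + rng.normal(0, 0.01)

# ARX(2,1) identification by least squares:
# y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + e[k]
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
theta = np.linalg.lstsq(Phi, y[2:], rcond=None)[0]
print("identified [a1, a2, b1]:", np.round(theta, 3))
```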

Keywords: lathe spindle, QFT, robust control, system identification

Procedia PDF Downloads 515
19668 A Computational Model of the Thermal Grill Illusion: Simulating the Perceived Pain Using Neuronal Activity in Pain-Sensitive Nerve Fibers

Authors: Subhankar Karmakar, Madhan Kumar Vasudevan, Manivannan Muniyandi

Abstract:

The Thermal Grill Illusion (TGI) elicits a strong and often painful burning sensation when interlacing warm and cold stimuli, each individually non-painful, excite thermoreceptors beneath the skin. Among several theories of TGI, the “disinhibition” theory is the most widely accepted in the literature. According to this theory, TGI results from the disinhibition, or unmasking, of the pain-sensitive HPC (heat-pinch-cold) nerve fibers due to the inhibition of the cold-sensitive nerve fibers that are responsible for masking them. Although researchers have focused on understanding TGI through experiments and models, none have investigated the prediction of TGI pain intensity through a computational model. Furthermore, the comparison of psychophysically perceived TGI intensity with neurophysiological models has not yet been studied. Predicting pain intensity through a computational model of TGI can help in optimizing thermal displays and in understanding pathological conditions related to temperature perception. The current study focuses on developing a computational model to predict the intensity of TGI pain and on experimentally observing the perceived TGI pain. The computational model is developed based on the disinhibition theory and utilizes existing popular models of warm and cold receptors in the skin. The model aims to predict the neuronal activity of the HPC nerve fibers. With a temperature-controlled thermal grill setup, fifteen participants (ten males and five females) were presented with five temperature differences between the warm and cold grills (each repeated three times). All participants rated the perceived TGI pain sensation on a scale of one to ten. For the range of temperature differences, the experimentally observed perceived intensity of TGI is compared with the neuronal activity of the pain-sensitive HPC nerve fibers. The simulation results show a monotonically increasing relationship between the temperature differences and the neuronal activity of the HPC nerve fibers, and a similar monotonically increasing relationship is experimentally observed between the temperature differences and the perceived TGI intensity. This shows that the TGI pain intensity observed in the experimental study can be compared with the neuronal activity predicted by the model. The proposed model intends to bridge the theoretical understanding of the TGI and the experimental results obtained through psychophysics. Further studies in pain perception are needed to develop a more accurate version of the current model.

Keywords: thermal grill illusion, computational modelling, simulation, psychophysics, haptics

Procedia PDF Downloads 139
19667 From Sampling to Sustainable Phosphate Recovery from Mine Waste Rock Piles

Authors: Hicham Amar, Mustapha El Ghorfi, Yassine Taha, Abdellatif Elghali, Rachid Hakkou, Mostafa Benzaazoua

Abstract:

Phosphate mine waste rock (PMWR) generated during ore extraction is continuously increasing, resulting in a significant environmental footprint. The main objectives of this study are i) the elaboration of a sampling strategy for the PMWR piles, ii) the mineralogical and chemical characterization of the PMWR piles, and iii) the creation of a 3D block model to evaluate the potential valorization of the existing PMWR. Destructive reverse-circulation drilling of 13 holes was used to collect samples for chemical (X-ray fluorescence) and mineralogical assays. The 3D block model was created from the resulting data set, including the chemical data of the drill holes, using Datamine RM software. Optical microscopy observations showed that the sandy phosphate from the drill holes in the PMWR piles is characterized by an abundance of carbonate fluorapatite with the presence of calcite, dolomite, and quartz. The mean grade of the composite samples was around 19.5±2.7% P₂O₅, and the mean P₂O₅ grade exhibited an increasing tendency along the depth profile from the bottom to the top of the PMWR piles. The 3D block model generated with the chemical data confirmed this tendency in the variation of the mean grades and may allow selective extraction according to %P₂O₅. The 3D block model of P₂O₅ grade is thus an efficient sampling approach that confirmed the variation of the P₂O₅ grade. This integrated approach to PMWR management will be a helpful decision-making tool for recovering the residual phosphate, supporting the circular economy and sustainability in the phosphate mining industry.

Keywords: 3D modelling, reverse circulation drilling, circular economy, phosphate mine waste rock, sampling

Procedia PDF Downloads 45
19666 Comparison of Two Theories for the Critical Laser Radius in Thermal Quantum Plasma

Authors: Somaye Zare

Abstract:

The critical beam radius is a significant factor that predicts the behavior of a laser beam in plasma: if the laser beam radius is sufficiently larger than the critical radius, the beam will experience stable focusing in the plasma; otherwise, the beam will diverge after entering the plasma. In this work, considering the paraxial approximation and moment theories, the localization of a relativistic laser beam in a thermal quantum plasma is investigated. Using the dielectric function obtained from the quantum hydrodynamic model, the equation for the laser beam width parameter is obtained and solved numerically by the fourth-order Runge-Kutta method. The results demonstrate that a stronger focusing effect occurs in the moment theory than in the paraxial approximation. In both theories, with increasing Fermi temperature, plasma density, and laser intensity, the oscillation rate of the beam width parameter grows and the focusing length decreases, which means an improved focusing effect. Furthermore, the behavior of the critical laser radius differs between the two theories: in the paraxial approximation, the critical radius, after reaching a minimum value, increases with increasing laser intensity, whereas in the moment theory the critical radius decreases with increasing laser intensity until it becomes independent of the laser intensity.
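
The numerical scheme itself is standard. The sketch below integrates a schematic paraxial-type beam-width equation with a classical fourth-order Runge-Kutta step; the equation form f'' = A/f**3 - B*f and the coefficients A and B are placeholders standing in for the diffraction and nonlinear terms of the actual quantum-hydrodynamic dielectric model, which the abstract does not reproduce.

```python
import numpy as np

# Schematic beam-width equation f''(xi) = A/f**3 - B*f; A, B are
# assumed coefficients for illustration, not the paper's model.
A, B = 1.0, 1.2

def rhs(state):
    f, g = state                       # g = df/dxi
    return np.array([g, A / f**3 - B * f])

def rk4_step(state, h):
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * h * k1)
    k3 = rhs(state + 0.5 * h * k2)
    k4 = rhs(state + h * k3)
    return state + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

state, h = np.array([1.0, 0.0]), 1e-3  # f(0) = 1, f'(0) = 0
widths = []
for _ in range(20000):
    state = rk4_step(state, h)
    widths.append(state[0])

# An oscillating f between these bounds indicates periodic self-focusing.
print("min/max beam width parameter:",
      round(min(widths), 3), round(max(widths), 3))
```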

Keywords: laser localization, quantum plasma, paraxial approximation, moment theory, quantum hydrodynamic model

Procedia PDF Downloads 46