Search results for: computational complexity theory
7810 Modal Density Influence on Modal Complexity Quantification in Dynamic Systems
Authors: Fabrizio Iezzi, Claudio Valente
Abstract:
The viscous damping in dynamic systems can be proportional or non-proportional. In the first case, the mode shapes are real, whereas in the second case they are complex. From an engineering point of view, the complexity of the mode shapes is important in order to quantify the non-proportional damping. Different indices exist to provide estimates of the modal complexity; these indices are zero or nonzero depending on whether the mode shapes are real or complex. The modal density problem arises in experimental identification when the dynamic systems have close modal frequencies. Depending on the degree of this closeness, the mode shapes can hold fictitious imaginary quantities that affect the values of the modal complexity indices. The result is a failure to identify whether the mode shapes are real or complex, and hence whether the damping is proportional or non-proportional. The paper aims to show the influence of the modal density on the values of these indices in the case of both proportional and non-proportional damping. Theoretical and pseudo-experimental solutions are compared on an appropriate mechanical system to analyze the problem.
Keywords: complex mode shapes, dynamic systems identification, modal density, non-proportional damping
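One widely used index of this kind is the Modal Phase Collinearity (MPC), computed from the eigenvalues of the 2x2 covariance matrix of a mode shape's real and imaginary parts. The sketch below is a minimal generic illustration, not the specific indices studied in the paper; it returns 1 for a purely real mode shape and approaches 0 for a strongly complex one.

```python
import numpy as np

def mpc(phi):
    """Modal Phase Collinearity of a complex mode shape vector.

    Returns 1.0 for a real (proportionally damped) mode shape and
    values near 0 for strongly complex mode shapes.
    """
    phi = np.asarray(phi, dtype=complex)
    # Center the real and imaginary parts before forming the covariance terms
    re = phi.real - phi.real.mean()
    im = phi.imag - phi.imag.mean()
    sxx, syy, sxy = re @ re, im @ im, re @ im
    # Eigenvalues of the 2x2 matrix [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = np.sqrt(tr ** 2 / 4.0 - det)
    lam1, lam2 = tr / 2.0 + disc, tr / 2.0 - disc
    return ((lam1 - lam2) / (lam1 + lam2)) ** 2

# A real mode shape gives MPC close to 1; a mode shape whose components are
# spread over the complex plane gives MPC close to 0.
real_mode = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
print(mpc(real_mode))              # ~1.0
print(mpc([1, 1j, -1, -1j]))       # ~0.0
```

A fictitious imaginary contribution injected by close modes would lower this index even for a proportionally damped system, which is the identification failure the abstract describes.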
Procedia PDF Downloads 388
7809 Contextualizing Theory Z of Motivation Among Indian Universities of Higher Education
Authors: Janani V., Tanika Singh, Bala Subramanian R., Santosh Kumar Sharma
Abstract:
Higher education across the globe is undergoing a sea change. This has led to varied management practices in Indian universities, and therefore there is no universal pattern of HR policies and practices in these universities. As a result, faculty retention is very low, which is a serious concern for educational leaders such as vice-chancellors or directors working in the higher education sector. This phenomenon can be understood in the light of various management theories, among which Theory Z, proposed by William Ouchi, is a prominent one. With this backdrop, the present article strives to contextualize Theory Z in Indian higher education. For this purpose, a qualitative methodology has been adopted, and accordingly, propositions have been generated. We believe that this article will motivate other researchers to empirically test the generated propositions and thereby contribute to the existing literature.
Keywords: education, management, motivation, Theory X, Theory Y, Theory Z, faculty members, universities, India
Procedia PDF Downloads 116
7808 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment
Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai
Abstract:
Amongst all natural hazards, earthquakes have the potential to cause the greatest damage. Since earthquake forces are random in nature and unpredictable, their quantification becomes important for hazard assessment. The time and place of a future earthquake are both uncertain. Since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that the damage to life and property is minimized. Seismic hazard analysis plays an important role in the design of earthquake-resistant structures by providing rational values of input parameters. In this paper, both mathematical and computational methods adopted by researchers globally in the past five years are discussed. Mathematical approaches involving the concepts of Poisson's ratio, Convex Set Theory, Empirical Green's Function, Bayesian probability estimation applied to seismic hazard, and FOSM (first-order second-moment) algorithm methods are discussed. Computational approaches and the numerical model SSIFiBo, developed in MATLAB to study the dynamic soil-structure interaction problem, are discussed in this paper. A GIS-based tool, predominantly used in the assessment of seismic hazards, is also discussed.
Keywords: computational methods, MATLAB, seismic hazard, seismic measurements
Procedia PDF Downloads 341
7807 Logic of the Prospect Theory: The Decision Making Process of the First Gulf War and the Crimean Annexation
Authors: Zhengyang Ma, Zhiyao Li, Jiayi Zhang
Abstract:
This article examines the prospect theory's arguments about decision-making through two case studies: the First Gulf War and Russia's annexation of Crimea. The article uses the methods of comparative case analysis and process tracing to investigate the theory's fundamental arguments. Through evidence derived from existing primary and secondary sources, this paper argues that both former U.S. President Bush and Russian President Putin viewed their situations as a domain of loss and made risky decisions to prevent further deterioration, which attests to the arguments of the prospect theory. After the two case studies, this article also discusses how the prospect theory could be used to analyze the decision-making process that led to the current Russia-Ukraine War.
Keywords: the prospect theory, international relations, the first gulf war, the crimea crisis
Procedia PDF Downloads 126
7806 Green Energy, Fiscal Incentives and Conflicting Signals: Analysing the Challenges Faced in Promoting on Farm Waste to Energy Projects
Authors: Hafez Abdo, Rob Ackrill
Abstract:
Renewable energy (RE) promotion in the UK relies on multiple policy instruments, which are required to overcome the path dependency pressures favouring fossil fuels. These instruments include targeted funding schemes and economy-wide instruments embedded in the tax code. The resulting complexity of incentives raises important questions around the coherence and effectiveness of these instruments for RE generation. This complexity is exacerbated by UK RE policy being nested within EU policy in a multi-level governance (MLG) setting. To gain analytical traction on such complexity, this study will analyse policies promoting the on-farm generation of energy for heat and power, from farm and food waste, via anaerobic digestion. Utilising both primary and secondary data, it seeks to address a particular lacuna in the academic literature. Via a localised, in-depth investigation into the complexity of policy instruments promoting RE, this study will help our theoretical understanding of the challenges that MLG and path dependency pressures present to policymakers of multi-dimensional policies.
Keywords: anaerobic digestion, energy, green, policy, renewable, tax, UK
Procedia PDF Downloads 371
7805 Determination of Complexity Level in Okike's Merged Irregular Transposition Cipher
Authors: Okike Benjami, Garba Ejd
Abstract:
Today, it has been observed that the security of information along the superhighway is often compromised by those who are not authorized to have access to such information. In order to ensure the security of information along the superhighway, such information should be encrypted by some means to conceal its real meaning. There are many encryption techniques in the market; however, some of them are decrypted by adversaries with ease. The researcher has decided to develop an encryption technique that may be more difficult to decrypt. This may be achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions before transmitting the message along the superhighway. The method is termed Okike's Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
Procedia PDF Downloads 290
7804 Molecular Electron Density Theory Study on the Mechanism and Selectivity of the 1,3 Dipolar Cycloaddition Reaction of N-Methyl-C-(2-Furyl) Nitrone with Activated Alkenes
Authors: Moulay Driss Mellaoui, Abdallah Imjjad, Rachid Boutiddar, Haydar Mohammad-Salim, Nivedita Acharjee, Hassan Bourzi, Souad El Issami, Khalid Abbiche, Hanane Zejli
Abstract:
We have investigated the underlying molecular processes involved in the [3+2] cycloaddition (32CA) reactions between N-methyl-C-(2-furyl) nitrone and three acetylene derivatives: 4b, 5b, and 6b. For this investigation, we utilized molecular electron density theory (MEDT) and density functional theory (DFT) methods at the B3LYP-D3/6-31G(d) computational level. These 32CA reactions, which exhibit a zwitterionic (zw-type) nature, proceed through a one-step mechanism with activation enthalpies ranging from 8.80 to 14.37 kcal mol⁻¹ in acetonitrile and ethanol solvents. When the nitrone reacts with phenyl methyl propiolate (4b), two regioisomeric pathways lead to the formation of two products: P1,5-4b and P1,4-4b. On the other hand, when the nitrone reacts with dimethyl acetylenedicarboxylate (5b) and acetylenedicarboxylic acid (but-2-ynedioic acid) (6b), it results in the formation of a single product. Through topological analysis, the nitrone can be categorized as a zwitterionic three-atom component (TAC). Furthermore, the analysis of conceptual density functional theory (CDFT) indices classifies the 32CA reactions of the nitrone with 4b, 5b, and 6b as forward electron density flux (FEDF) reactions. The study of bond evolution theory (BET) reveals that the formation of the new C-C and C-O covalent bonds does not initiate in the transition states, as the intermediate stages of these reactions display pseudoradical centers on the atoms already involved in bonding.
Keywords: 4-isoxazoline, DFT/B3LYP-D3, regioselectivity, cycloaddition reaction, MEDT, ELF
Procedia PDF Downloads 184
7803 Disintegration of Deuterons by Photons Reaction Model for GEANT4 with Dibaryon Formalism
Authors: Jae Won Shin, Chang Ho Hyun
Abstract:
A model of the disintegration of deuterons by photons (dγ → np) is developed for GEANT4 in this work. By introducing a dibaryon field, an effective field theory can take the effective-range contribution to the propagator into account up to infinite order, which consequently makes the convergence of the theory better than that of the pionless effective field theory without dibaryon fields. We develop a hadronic model for GEANT4 that is specialized for the disintegration of the deuteron by photons, dγ → np. For the description of two-nucleon interactions, we employ an effective field theory, the so-called pionless theory with dibaryon fields (dEFT). In spite of its simplicity, the theory has proven very effective and useful in applications to various two-nucleon systems and processes at low energies. We apply the new GEANT4 model (G4dEFT) to the calculation of total and differential cross sections in dγ → np and obtain good agreement with experimental data for a wide range of incoming photon energies.
Keywords: dγ → np, dibaryon fields, effective field theory, GEANT4
Procedia PDF Downloads 380
7802 A Numerical Model Simulation for an Updraft Gasifier Using High-Temperature Steam
Authors: T. M. Ismail, M. A. El-Salam
Abstract:
A mathematical model study was carried out to investigate the gasification of biomass fuels using high-temperature air (up to 1000°C) and steam as gasifying agents. In this study, a 2D computational fluid dynamics model was developed to study the gasification process in an updraft gasifier, considering drying, pyrolysis, combustion, and gasification reactions. The gas and solid phases were resolved using an Euler-Euler multiphase approach, with exchange terms for momentum, mass, and energy. The standard k-ε turbulence model was used in the gas phase, and the particle phase was modeled using the kinetic theory of granular flow. The results show that the present model offers a promising way forward in its capability and sensitivity to the parameter effects that influence the gasification process.
Keywords: computational fluid dynamics, gasification, biomass fuel, fixed bed gasifier
Procedia PDF Downloads 407
7801 Efficient Signal Detection Using QRD-M Based on Channel Condition in MIMO-OFDM System
Authors: Jae-Jeong Kim, Ki-Ro Kim, Hyoung-Kyu Song
Abstract:
In this paper, an efficient signal detector that switches the M parameter of the QRD-M detection scheme is proposed for MIMO-OFDM systems. The proposed detection scheme calculates a threshold from the 1-norm condition number of the channel matrix and then switches the M parameter of the QRD-M detection scheme according to the channel information. If the channel condition is bad, the parameter M is set to a high value to increase the accuracy of detection; if the channel condition is good, M is set to a low value to reduce the complexity of detection. Therefore, the proposed detection scheme has a better trade-off between BER performance and complexity than the conventional detection scheme. The simulation results show that the complexity of the proposed detection scheme is lower than that of the conventional QRD-M detection scheme, with similar BER performance.
Keywords: MIMO-OFDM, QRD-M, channel condition, BER
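The switching rule described above can be sketched as follows. The threshold and the two M values below are illustrative assumptions, not figures from the paper; the point is only the mechanism of mapping the 1-norm condition number of the channel matrix to a survivor count.

```python
import numpy as np

def select_m(h, threshold=10.0, m_low=2, m_high=8):
    """Pick the QRD-M survivor count M from the channel condition.

    A large 1-norm condition number signals an ill-conditioned channel,
    so more candidate paths (larger M) are kept for detection accuracy;
    a well-conditioned channel allows a small M to cut complexity.
    """
    kappa = np.linalg.cond(h, 1)  # 1-norm condition number of H
    return m_high if kappa > threshold else m_low

# Well-conditioned channel (identity): keep few survivor paths.
print(select_m(np.eye(4)))                      # 2
# Nearly rank-deficient channel: keep many survivor paths.
h_bad = np.array([[1.0, 1.0], [1.0, 1.0001]])
print(select_m(h_bad))                          # 8
```

In a real detector the threshold would be tuned against the target BER, and `m` would then bound the number of surviving branches per QRD-M tree level.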
Procedia PDF Downloads 371
7800 A Study of Chinese-specific Terms in Government Work Report (2017-2019) from the Perspective of Relevance Theory
Authors: Shi Jiaxin
Abstract:
The Government Work Report is an essential form of document in the government of the People's Republic of China. It covers all aspects of Chinese society and reflects China's development strategy and trends. There are countless specific terms in the Government Work Report, and only by understanding these Chinese-specific terms can one understand its content. Only by accurately translating the Chinese-specific terms can people from all across the world know the Government Work Report and understand China. Relevance theory is a popular theory of cognitive pragmatics. Relevance translation theory, which is closely related to relevance theory, has major guiding significance for the translation of Chinese-specific terms. Through studying relevance theory and researching the translation techniques, strategies, and applications in the process of translating Chinese-specific terms from the perspective of relevance theory, we can understand the meaning and connotation of Chinese-specific terms, solve various problems in the process of C-E translation, and strengthen our translation ability.
Keywords: government work report, Chinese-specific terms, relevance theory, translation
Procedia PDF Downloads 172
7799 Value from Environmental and Cultural Perspectives or Two Sides of the Same Coin
Authors: Vilem Paril, Dominika Tothova
Abstract:
This paper discusses the value theory in cultural heritage and the value theory in environmental economics. Two economic views of value theory are compared within the field of cultural heritage maintenance and within the field of the environment. The main aims are to find common features in these two differently structured theories beneath the layer of differently defined terms, as well as genuinely differing features of the two approaches; to clear up the confusion that stems from different terminology where the terms in fact capture the same aspects of reality; and to show the possible inspiration these two perspectives can offer one another. Another aim is to present these two value systems in one value framework. First, important moments of value theory from the economic perspective are presented, leading to the marginal revolution of (not only) the Austrian School. Then the theory of value within cultural heritage and environmental economics is explored. Finally, the individual approaches are compared and their potential mutual inspiration searched for.
Keywords: cultural heritage, environmental economics, existence value, value theory
Procedia PDF Downloads 322
7798 Effect of Phonological Complexity in Children with Specific Language Impairment
Authors: Irfana M., Priyandi Kabasi
Abstract:
Children with specific language impairment (SLI) have difficulty acquiring and using language despite having all the cognitive prerequisites to support language acquisition. These children have normal non-verbal intelligence, hearing, and oral-motor skills, with no history of social/emotional problems or significant neurological impairment. Nevertheless, their language acquisition lags behind that of their peers. Phonological complexity can be considered a major factor causing inaccurate production of speech in this population; hence, a range of complex phonological stimuli should be incorporated into treatment sessions for SLI for a better prognosis of speech accuracy, and the levels of phonological complexity need to be studied. The present study comprised 7 individuals diagnosed with SLI and 10 developmentally normal children. All were Hindi speakers of both genders, aged 4 to 5 years. There were 4 sets of stimuli: minimal contrast vs. maximal contrast nonwords, minimal coarticulation vs. maximal coarticulation nonwords, minimal contrast vs. maximal contrast words, and minimal coarticulation vs. maximal coarticulation words. Each set contained 10 stimuli, and participants were asked to repeat each stimulus. Results showed that production of maximal contrast was the most accurate, followed by minimal coarticulation, minimal contrast, and maximal coarticulation. A similar trend was shown for both the word and non-word categories of stimuli. The phonological complexity effect was evident for each participant group. Moreover, the present findings can be applied to the management of SLI, specifically to the selection of stimuli.
Keywords: coarticulation, minimal contrast, phonological complexity, specific language impairment
Procedia PDF Downloads 143
7797 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Modeling option prices with Black-Scholes-type models augmented with jumps makes it possible to account for market movements. However, such models can only be solved numerically, and not all numerical methods are efficient for them, because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial-fraction form of Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model
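The O(M²) to O(M log M) reduction rests on the fact that the discretized jump-integral term is a Toeplitz matrix, whose product with a vector can be computed by embedding it in a circulant matrix, which the FFT diagonalizes. A minimal sketch of that building block (the function and setup are illustrative, not the authors' code):

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    """Multiply a Toeplitz matrix by a vector in O(M log M) via the FFT.

    c: first column, r: first row (with c[0] == r[0]), x: vector.
    The Toeplitz matrix is embedded in a circulant matrix of size 2M-1
    whose matvec is a circular convolution, i.e. a pointwise product
    in the Fourier domain.
    """
    m = len(x)
    # First column of the embedding circulant: the column, then the
    # reversed tail of the row.
    emb = np.concatenate([c, r[1:][::-1]])
    y = np.fft.ifft(np.fft.fft(emb) * np.fft.fft(x, len(emb)))
    return y[:m].real

# Check against the direct O(M^2) product.
rng = np.random.default_rng(0)
c = rng.standard_normal(5)                             # first column
r = np.concatenate([[c[0]], rng.standard_normal(4)])   # first row
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(5)]
              for i in range(5)])
x = rng.standard_normal(5)
print(np.allclose(T @ x, toeplitz_matvec(c, r, x)))    # True
```

Inside an iterative or time-stepping scheme, this matvec replaces the dense product at every step, which is where the overall complexity gain comes from.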
Procedia PDF Downloads 153
7796 Theoretical Paradigms for Total Quality Environmental Management (TQEM)
Authors: Mohammad Hossein Khasmafkan Nezam, Nader Chavoshi Boroujeni, Mohamad Reza Veshaghi
Abstract:
Quality management is dominated by rational paradigms for the measurement and management of quality, but these paradigms start to 'break down' when faced with the inherent complexity of managing quality in intensely competitive, changing environments. In this article, the various theoretical paradigms employed to manage quality are reviewed, and the advantages and limitations of these paradigms are highlighted. A major implication of this review is that, when faced with complexity, an ideological stance toward any single strategy paradigm for total quality environmental management is ineffective. We suggest that as complexity increases and we envisage intensely competitive, changing environments, there will be a greater need to consider a multi-paradigm integrationist view of strategy for TQEM.
Keywords: total quality management (TQM), total quality environmental management (TQEM), ideologies (philosophy), theoretical paradigms
Procedia PDF Downloads 321
7795 An Axiomatic Approach to Constructing an Applied Theory of Possibility
Authors: Oleksii Bychkov
Abstract:
The fundamental difference between randomness and vagueness is that the former requires statistical research. These issues were studied by Zadeh L., Dubois D., and Prade H. The theory of possibility works with expert assessments, hypotheses, etc., and gives an idea of the characteristics of the problem situation, the nature of the goals, and the real limitations. Possibility theory examines experiments that are not repeated. The article discusses issues related to the formalization of a fuzzy, uncertain idea of reality. The author proposes to expand the classical model of possibility theory by introducing a measure of necessity. The proposed model allows the measures of possibility and necessity to be extended onto a Boolean algebra while preserving the properties of the measures. Thus, upper and lower estimates are obtained to describe whether an event will occur. Knowledge of the patterns that govern mass random, uncertain, and fuzzy events allows us to predict how these events will proceed. The article quite fully reveals the essence of the construction and use of probability theory and possibility theory.
Keywords: possibility, artificial, modeling, axiomatics, intellectual approach
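The dual pair of upper and lower estimates described above can be sketched for a finite universe: the possibility of an event is the maximum of the possibility distribution over it, and the necessity is one minus the possibility of the complement. The distribution and event below are illustrative assumptions, not taken from the article.

```python
def possibility(pi, event):
    """Pos(A): upper estimate, the max of the distribution over A."""
    return max((pi[x] for x in event), default=0.0)

def necessity(pi, event):
    """Nec(A) = 1 - Pos(complement of A): the dual lower estimate."""
    complement = set(pi) - set(event)
    return 1.0 - possibility(pi, complement)

# A possibility distribution over a small universe (normalized: max is 1).
pi = {"low": 0.2, "medium": 1.0, "high": 0.6}
a = {"medium", "high"}
print(possibility(pi, a))   # 1.0
print(necessity(pi, a))     # 0.8, i.e. 1 - Pos({"low"})
```

For a normalized distribution, Nec(A) <= Pos(A) always holds, which is exactly the bracketing of "the event will occur" between a lower and an upper estimate.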
Procedia PDF Downloads 35
7794 Nonlinear Modelling of Sloshing Waves and Solitary Waves in Shallow Basins
Authors: Mohammad R. Jalali, Mohammad M. Jalali
Abstract:
The earliest theories of sloshing waves and solitary waves, based on potential theory idealisations and irrotational flow, have been extended to be applicable to more realistic domains. To this end, computational fluid dynamics (CFD) methods are widely used. Three-dimensional CFD methods, such as Navier-Stokes solvers with volume-of-fluid treatment of the free surface and Navier-Stokes solvers with mappings of the free surface, inherently impose high computational expense; therefore, considerable effort has gone into developing depth-averaged approaches. Examples of such approaches include the Green–Naghdi (GN) equations. In the Cartesian system, the GN velocity profile depends on the horizontal directions, the x-direction and y-direction. The effect of the vertical direction (z-direction) is also taken into consideration by applying a weighting function in the approximation. GN theory considers the effect of vertical acceleration and the consequent non-hydrostatic pressure. Moreover, in GN theory, the flow is rotational. The present study illustrates the application of the GN equations to the propagation of sloshing waves and solitary waves. For this purpose, the GN equations solver is verified against the benchmark tests of Gaussian hump sloshing and solitary wave propagation in shallow basins. Analysis of the free surface sloshing of even harmonic components of an initial Gaussian hump demonstrates that the GN model gives predictions in satisfactory agreement with the linear analytical solutions. Discrepancies between the GN predictions and the linear analytical solutions arise from wave nonlinearities, both from the wave amplitude itself and from wave-wave interactions. Numerically predicted solitary wave propagation indicates that the GN model produces simulations in good agreement with the analytical solution of linearised wave theory.
Comparison between the GN model's numerical prediction and the result from perturbation analysis confirms that the nonlinear interaction between a solitary wave and a solid wall is satisfactorily modelled. Moreover, solitary wave propagation at an angle to the x-axis and the interaction of solitary waves with each other are simulated to validate the developed model.
Keywords: Green–Naghdi equations, nonlinearity, numerical prediction, sloshing waves, solitary waves
Procedia PDF Downloads 287
7793 A Computational Study of the Electron Transport in HgCdTe Bulk Semiconductor
Abstract:
This paper deals with the use of a computational method based on Monte Carlo simulation to investigate the transport phenomena of electrons in the HgCdTe narrow-band-gap semiconductor. Via this method we can evaluate the time dependence of the transport parameters: velocity, energy, and mobility of electrons through matter (HgCdTe).
Keywords: Monte Carlo, transport parameters, HgCdTe, computational mechanics
Procedia PDF Downloads 475
7792 A Hybrid Classical-Quantum Algorithm for Boundary Integral Equations of Scattering Theory
Authors: Damir Latypov
Abstract:
A hybrid classical-quantum algorithm to solve boundary integral equations (BIE) arising in problems of electromagnetic and acoustic scattering is proposed. The quantum speed-up is due to a Quantum Linear System Algorithm (QLSA). The original QLSA of Harrow et al. provides an exponential speed-up over the best-known classical algorithms, but only in the case of sparse systems. Due to the non-local nature of integral operators, however, matrices arising from the discretization of BIEs are dense. A QLSA for dense matrices was introduced in 2017. Its runtime as a function of the system's size N is bounded by O(√N polylog(N)), while the runtime of the best-known classical algorithm for an arbitrary dense matrix scales as O(N^2.373). Instead of the exponential speed-up available for sparse matrices, here we have only a polynomial speed-up. Nevertheless, the sufficiently high power of this polynomial, ~4.7, should make the QLSA an appealing alternative. Unfortunately for the QLSA, the asymptotic separability of the Green's function leads to high compressibility of the BIE matrices. Classical fast algorithms such as the Multilevel Fast Multipole Method (MLFMM) take advantage of this fact and reduce the runtime to O(N log(N)), i.e., the QLSA is only quadratically faster than the MLFMM. To be truly impactful for computational electromagnetics and acoustics engineers, the QLSA must provide a more substantial advantage than that. We propose a computational scheme that combines elements of the classical fast algorithms with the QLSA to achieve the required performance.
Keywords: quantum linear system algorithm, boundary integral equations, dense matrices, electromagnetic scattering theory
Procedia PDF Downloads 156
7791 Exploring the Nature and Meaning of Theory in the Field of Neuroeducation Studies
Authors: Ali Nouri
Abstract:
Neuroeducation is one of the most exciting research fields and is continually evolving. However, there is a need to develop its theoretical bases in connection with practice. The present paper is a starting attempt in this regard to provide a space from which to think about neuroeducational theory and to invoke more investigation in this area. Accordingly, a comprehensive theory of neuroeducation could be defined as a grouping or clustering of concepts and propositions that describe and explain the nature of human learning so as to provide valid interpretations and implications useful for educational practice in relation to philosophical aspects or values. While such a theory should originate from the philosophical foundations of the field and explain its normative significance, it needs to be testable in terms of rigorous evidence in order to fundamentally advance contemporary educational policy and practice. There is thus a pragmatic need to include a course on neuroeducational theory in the curriculum of the field. In addition, there is a need to articulate and disseminate considerable discussion of the subject within professional journals and academic societies.
Keywords: neuroeducation studies, neuroeducational theory, theory building, neuroeducation research
Procedia PDF Downloads 449
7790 Pseudo Modal Operating Deflection Shape Based Estimation Technique of Mode Shape Using Time History Modal Assurance Criterion
Authors: Doyoung Kim, Hyo Seon Park
Abstract:
Studies of system identification (SI) based on structural health monitoring (SHM) have been actively conducted for structural safety. Recently, SI techniques have developed rapidly within the output-only SI paradigm for estimating modal parameters. Representative output-only SI methods, such as Frequency Domain Decomposition (FDD) and Stochastic Subspace Identification (SSI), use algorithms based on orthogonal decomposition, such as singular value decomposition (SVD). However, the SVD leads to a high level of computational complexity in estimating modal parameters. This paper proposes a technique to estimate the mode shape at lower computational cost. The technique obtains a pseudo modal Operating Deflection Shape (ODS) through a bandpass filter and proposes a time history Modal Assurance Criterion (MAC). Finally, the mode shape can be estimated from the pseudo modal ODS and the time history MAC. Analytical simulations of vibration measurement were performed, and the results for mode shape and computation time were compared between a representative SI method and the proposed method.
Keywords: modal assurance criterion, mode shape, operating deflection shape, system identification
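The standard MAC underlying the proposed time history variant compares two mode shape vectors through a normalized inner product; it is 1 when the shapes coincide up to scaling and 0 when they are orthogonal. A generic sketch (not the authors' specific time history formulation):

```python
import numpy as np

def mac(phi_1, phi_2):
    """Modal Assurance Criterion between two (possibly complex) mode shapes.

    MAC = |phi_1^H phi_2|^2 / ((phi_1^H phi_1)(phi_2^H phi_2)),
    ranging from 0 (orthogonal shapes) to 1 (identical up to scaling).
    """
    phi_1 = np.asarray(phi_1, dtype=complex)
    phi_2 = np.asarray(phi_2, dtype=complex)
    num = np.abs(np.vdot(phi_1, phi_2)) ** 2
    den = np.vdot(phi_1, phi_1).real * np.vdot(phi_2, phi_2).real
    return num / den

mode = np.array([1.0, -2.0, 1.5])
print(mac(mode, 3.0 * mode))          # 1.0: scaling does not matter
print(mac([1.0, 0.0], [0.0, 1.0]))    # 0.0: orthogonal shapes
```

In a time history setting, one would evaluate this criterion between a filtered response snapshot (the pseudo modal ODS) and a candidate shape at successive time steps, keeping the shape where the criterion stays near 1.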
Procedia PDF Downloads 410
7789 Towards Establishing a Universal Theory of Project Management
Authors: Divine Kwaku Ahadzie
Abstract:
Project management (PM) as a concept has evolved from the early 20th century into a recognized academic and professional discipline, and indications are that it is here to stay in the 21st century as a worldwide paradigm for managing successful construction projects. However, notwithstanding the strong inroads that PM has made in legitimizing its academic and professional status in construction management practice, its underlying philosophies are still based on cases and conventional practices. An important theoretical issue yet to be addressed is the lack of a universal theory that offers philosophical legitimacy for the PM concept as a uniquely specialized management concept. Here, it is hypothesized that the law of entropy, the theory of uncertainty, and the theory of risk management offer plausible explanations for addressing the lacuna of what constitutes PM theory. The theoretical bases of these plausible underlying theories are argued, and attempts are made to establish the functional relationships that exist between these theories and the PM concept. The paper then draws on data related to the success and/or failure of a number of construction projects to validate the theory.
Keywords: concepts, construction, project management, universal theory
Procedia PDF Downloads 328
7788 Walking the Tightrope: Balancing Project Governance, Complexity, and Servant Leadership for Megaproject Success
Authors: Muhammad Shoaib Iqbal, Shih Ping Ho
Abstract:
Megaprojects are large-scale, complex ventures with significant financial investments, numerous stakeholders, and extended timelines, requiring meticulous management for successful completion. This study explores the interplay between project governance, project complexity, and servant leadership and their combined effects on project success, specifically within the context of Pakistani megaprojects. The primary objectives are to examine the direct impact of project governance on project success, understand the negative influence of project complexity, assess the positive role of servant leadership, explore the moderating effect of servant leadership on the relationship between governance and success, and investigate how servant leadership mitigates the adverse effects of complexity. Using a quantitative approach, survey data were collected from project managers and team members involved in Pakistani megaprojects. Using a Comprehensive empirical model, 257 Valid responses were analyzed. Multiple regression analysis tested the hypothesized relationships and interaction effects using PLS-SEM. Findings reveal that project governance significantly enhances project success, emphasizing the need for robust governance structures. Conversely, project complexity negatively impacts success, highlighting the challenges of managing complex projects. Servant leadership significantly boosts project success by prioritizing team support and empowerment. Although the interaction between governance and servant leadership is not significant, suggesting no significant change in project success, servant leadership significantly mitigates the negative effects of project complexity, enhancing team resilience and adaptability. These results underscore the necessity for a balanced approach integrating strong governance with flexible, supportive leadership. 
The study offers valuable insights for practitioners, recommending adaptive governance frameworks and promoting servant leadership to improve the management and success rates of megaprojects. This research contributes to the broader understanding of effective project management practices in complex environments.Keywords: project governance, project complexity, servant leadership, project success, megaprojects, Pakistan
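The moderation effect described in this abstract can be illustrated with a minimal moderated-regression sketch on simulated data. This is not the study's PLS-SEM analysis; the variable names, coefficients, and sample size below are all hypothetical.

```python
import numpy as np

# Simulated data: all names and effect sizes are hypothetical.
rng = np.random.default_rng(0)
n = 200
governance = rng.normal(size=n)
complexity = rng.normal(size=n)
leadership = rng.normal(size=n)
# Success: governance helps, complexity hurts, and servant leadership
# dampens the negative complexity effect (positive interaction term).
success = (0.5 * governance - 0.6 * complexity + 0.4 * leadership
           + 0.3 * complexity * leadership
           + rng.normal(scale=0.5, size=n))

# Moderated regression: intercept, main effects, complexity x leadership.
X = np.column_stack([np.ones(n), governance, complexity, leadership,
                     complexity * leadership])
beta, *_ = np.linalg.lstsq(X, success, rcond=None)
print(np.round(beta, 2))  # coefficients near [0, 0.5, -0.6, 0.4, 0.3]
```

A significant positive interaction coefficient is the regression-level counterpart of the mitigation effect reported in the abstract.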
Procedia PDF Downloads 377787 Exploring the Applications of Neural Networks in the Adaptive Learning Environment
Authors: Baladitya Swaika, Rahul Khatry
Abstract:
Computer Adaptive Tests (CATs) are among the most efficient ways of testing the cognitive abilities of students. CATs are based on Item Response Theory (IRT), which rests on item selection (by maximum information or selection from the posterior) and ability estimation (by maximum-likelihood (ML) or maximum a posteriori (MAP) estimators). This study aims at combining the classical and Bayesian approaches to IRT to create a dataset which is then fed to a neural network that automates the process of ability estimation, and at comparing this network to traditional CAT models designed using IRT. The study uses Python as the base coding language, pymc for statistical modelling of the IRT model, and scikit-learn for the neural network implementation. On creation of the model and on comparison, it is found that the neural-network-based model performs 7-10% worse than the IRT model for score estimation. Although it performs worse than the IRT model, the neural network model can be beneficially used in back-ends to reduce time complexity, as the IRT model would have to re-calculate the ability every time it receives a request, whereas the prediction from a trained neural network regressor can be done in a single step. This study also proposes a new kind of framework whereby the neural network model could incorporate feature sets beyond the normal IRT feature set, using a neural network’s capacity to learn unknown functions to give rise to better CAT models. Categorical features such as test type could be learnt and incorporated into IRT functions with the help of techniques like logistic regression, yielding models that may not be trivial to express via equations. Such a framework, when implemented, would be highly advantageous in psychometrics and cognitive assessments.
This study gives a brief overview of how neural networks can be used in adaptive testing, not only by reducing time complexity but also by being able to incorporate newer and better datasets, which would eventually lead to higher-quality testing.Keywords: computer adaptive tests, item response theory, machine learning, neural networks
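For concreteness, the maximum-likelihood ability estimation mentioned above can be sketched with a numpy-only 2PL IRT toy. The item parameters and response pattern below are made up for illustration; this is not the authors' pymc/scikit-learn pipeline.

```python
import numpy as np

def irt_2pl_prob(theta, a, b):
    """Probability of a correct response under the 2PL IRT model."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def ml_ability_estimate(responses, a, b):
    """Grid-search maximum-likelihood ability estimate.

    responses: 0/1 array; a, b: item discriminations and difficulties."""
    grid = np.linspace(-4, 4, 801)
    # Log-likelihood of every candidate theta over all items at once.
    p = irt_2pl_prob(grid[:, None], a[None, :], b[None, :])
    ll = (responses * np.log(p) + (1 - responses) * np.log(1 - p)).sum(axis=1)
    return grid[np.argmax(ll)]

# Hypothetical 4-item test: correct on the easy items, wrong on the hard ones.
a = np.array([1.2, 0.8, 1.5, 1.0])   # discriminations
b = np.array([-0.5, 0.0, 0.5, 1.0])  # difficulties
responses = np.array([1, 1, 0, 0])
theta_hat = ml_ability_estimate(responses, a, b)
print(round(float(theta_hat), 2))
```

A neural-network regressor trained on (response pattern, theta) pairs would replace this per-request optimization with a single forward pass, which is the back-end speed-up the abstract describes.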
Procedia PDF Downloads 1767786 Problems of Boolean Reasoning Based Biclustering Parallelization
Authors: Marcin Michalak
Abstract:
Biclustering is a method of two-dimensional data analysis. In recent years it has become possible to express this task in terms of Boolean reasoning, for processing continuous, discrete, and binary data. The mathematical background of the approach, namely the proven ability to induce exact and inclusion-maximal biclusters fulfilling assumed criteria, is a strong advantage of the method. Unfortunately, the core of the method has quite high computational complexity. In the paper, the basics of the Boolean reasoning approach to biclustering are presented. In this context, the problems of parallelizing the computation are raised.Keywords: Boolean reasoning, biclustering, parallelization, prime implicant
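To make the notion of inclusion-maximal biclusters concrete, here is a brute-force sketch that enumerates maximal all-ones submatrices of a small binary matrix. It is not the paper's Boolean-reasoning (prime-implicant) algorithm; it only illustrates the object being computed and the combinatorial cost that motivates parallelization.

```python
from itertools import combinations

import numpy as np

def maximal_ones_biclusters(M):
    """Enumerate inclusion-maximal all-ones biclusters of a binary matrix.

    Brute force over column subsets: for each subset, the rows that are
    all-ones on it form the largest compatible row set; keep maximal pairs."""
    n_rows, n_cols = M.shape
    found = []
    for k in range(1, n_cols + 1):
        for cols in combinations(range(n_cols), k):
            rows = tuple(r for r in range(n_rows) if all(M[r, c] for c in cols))
            if rows:
                found.append((rows, cols))
    # Keep only pairs not contained in another (rows, cols) pair.
    return [bc for bc in found
            if not any(bc != other
                       and set(bc[0]) <= set(other[0])
                       and set(bc[1]) <= set(other[1])
                       for other in found)]

M = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
for rows, cols in maximal_ones_biclusters(M):
    print(rows, cols)
```

Even this tiny example scans all 2^n column subsets, which is exactly the kind of exponential core whose parallelization the paper discusses.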
Procedia PDF Downloads 1257785 Policy Compliance in Information Security
Authors: R. Manjula, Kaustav Bagchi, Sushant Ramesh, Anush Baskaran
Abstract:
In the past century, the emergence of information technology has had a significant positive impact on human life. While companies tend to focus on the completion of projects, the turn of the century has seen growing importance given to investment in information security policies. These policies are essential to protect important data from adversaries, and compliance with them has become one of the most important attributes of information security models. In this research, we have focussed on the factors affecting information security policy compliance in two models: the theory of planned behaviour, and the integration of social bond theory and involvement theory into a single model. Finally, we propose where these theories would be most successful.Keywords: information technology, information security, involvement theory, policies, social bond theory
Procedia PDF Downloads 3717784 Gaussian Particle Flow Bernoulli Filter for Single Target Tracking
Authors: Hyeongbok Kim, Lingling Zhao, Xiaohong Su, Junjie Wang
Abstract:
The Bernoulli filter is an exact Bayesian filter for single-target tracking based on random finite set theory. The standard Bernoulli filter often underestimates the number of targets. This study proposes a Gaussian particle flow (GPF) Bernoulli filter that employs particle flow to migrate particles from prior to posterior positions, improving the performance of the standard Bernoulli filter. By employing the particle flow filter, the computational speed of the Bernoulli filter is significantly improved. In addition, the GPF Bernoulli filter provides a more accurate estimate than the standard Bernoulli filter. Simulation results confirm the improved tracking performance and computational speed in two- and three-dimensional scenarios compared with other algorithms.Keywords: Bernoulli filter, particle filter, particle flow filter, random finite sets, target tracking
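The scalar backbone of any Bernoulli filter is its target-existence probability recursion, which can be sketched in a few lines. This is a clutter-free simplification with illustrative parameter values; the spatial density update (where the paper's particle flow operates) is omitted.

```python
def predict_existence(r, p_birth=0.1, p_survive=0.95):
    """Bernoulli filter prediction of the target-existence probability:
    a target either survives or is newly born."""
    return p_birth * (1.0 - r) + p_survive * r

def update_existence_missed(r_pred, p_detect=0.9):
    """Existence update when no measurement arrives (clutter-free case):
    a missed detection is evidence against existence."""
    return r_pred * (1.0 - p_detect) / (1.0 - r_pred * p_detect)

r = 0.5
r = predict_existence(r)         # 0.1 * 0.5 + 0.95 * 0.5 = 0.525
r = update_existence_missed(r)   # drops sharply after a missed detection
print(round(r, 3))
```

In the full filter, each particle carries a state hypothesis alongside this existence probability, and particle flow moves those particles toward the posterior instead of reweighting them.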
Procedia PDF Downloads 927783 Musical Composition by Computer with Inspiration from Files of Different Media Types
Authors: Cassandra Pratt Romero, Andres Gomez de Silva Garza
Abstract:
This paper describes a computational system designed to imitate human inspiration during musical composition. The system is called MIS (Musical Inspiration Simulator). The MIS system is inspired by media to which human beings are exposed daily (visual, textual, or auditory) to create new musical compositions based on the emotions detected in said media. After building the system we carried out a series of evaluations with volunteer users who used MIS to compose music based on images, texts, and audio files. The volunteers were asked to judge the harmoniousness and innovation in the system's compositions. An analysis of the results points to the difficulty of computational analysis of the characteristics of the media to which we are exposed daily, as human emotions have a subjective character. This observation will direct future improvements in the system.Keywords: human inspiration, musical composition, musical composition by computer, theory of sensation and human perception
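A toy version of the emotion-to-music mapping described above might look as follows. The mapping rules (valence selects the mode, arousal the tempo) are invented for illustration and are not the MIS system's actual design.

```python
def emotion_to_music(valence, arousal):
    """Map a detected emotion (valence, arousal in [-1, 1]) to simple
    musical parameters: mode, tempo, and a scale of MIDI pitches."""
    mode = "major" if valence >= 0 else "minor"
    tempo_bpm = int(90 + 60 * arousal)  # calmer input -> slower tempo
    root = 60                           # middle C
    intervals = ([0, 2, 4, 5, 7, 9, 11] if mode == "major"
                 else [0, 2, 3, 5, 7, 8, 10])
    return {"mode": mode, "tempo_bpm": tempo_bpm,
            "scale": [root + i for i in intervals]}

print(emotion_to_music(0.7, 0.5))   # upbeat input -> major, faster
print(emotion_to_music(-0.3, -0.5)) # sad, calm input -> minor, slower
```

The hard part, as the abstract's evaluation notes, is the emotion detection itself: the subjectivity of human emotion makes the (valence, arousal) inputs far harder to obtain than this deterministic mapping suggests.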
Procedia PDF Downloads 1857782 Chaotic Behavior in Monetary Systems: Comparison among Different Types of Taylor Rule
Authors: Reza Moosavi Mohseni, Wenjun Zhang, Jiling Cao
Abstract:
The aim of the present study is to detect chaotic behavior in a dynamical system relevant to monetary economics. The study employs three different forms of the Taylor rule: current-, forward-, and backward-looking. The results suggest the existence of chaotic behavior in all three systems. In addition, the results strongly indicate that using expectations, especially under the rational expectations hypothesis, can increase the complexity of the system and lead to more chaotic behavior.Keywords: Taylor rule, monetary system, chaos theory, Lyapunov exponent, GMM estimator
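The standard numerical test for chaos mentioned in the keywords, a positive largest Lyapunov exponent, can be demonstrated on the logistic map as a stand-in; the paper's Taylor-rule systems are not reproduced here.

```python
import math

def lyapunov_logistic(r, x0=0.4, n_transient=500, n_iter=20000):
    """Estimate the largest Lyapunov exponent of the logistic map
    x -> r*x*(1-x) by averaging log|f'(x)| along the orbit.
    A positive value indicates sensitive dependence, i.e. chaos."""
    x = x0
    for _ in range(n_transient):        # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1 - 2 * x)))  # log|f'(x)|
        x = r * x * (1 - x)
    return total / n_iter

print(lyapunov_logistic(3.2))  # periodic regime: negative exponent
print(lyapunov_logistic(4.0))  # chaotic regime: positive, near ln 2
```

The same orbit-averaging idea, applied to the Jacobian of a multidimensional Taylor-rule system, underlies the Lyapunov-exponent evidence for chaos reported in the abstract.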
Procedia PDF Downloads 5307781 CompleX-Machine: An Automated Testing Tool Using X-Machine Theory
Authors: E. K. A. Ogunshile
Abstract:
This paper is aimed at creating an automatic Java X-Machine testing tool for software development. The nature of software development is changing; thus, the type of software testing tools required is also changing. Software is growing increasingly complex and, in part due to the commercial impetus for faster software releases with new features and value, increasingly in danger of containing faults. These faults can incur huge costs for software development organisations and users; Cambridge Judge Business School’s research estimated the cost of software bugs to the global economy at $312 billion. Beyond the cost, faster software development methodologies and increasing expectations on developers to become testers are driving demand for faster, automated, and effective tools to prevent potential faults as early as possible in the software development lifecycle. Using X-Machine theory, this paper explores a new tool to address software complexity, changing expectations on developers, and faster development pressures and methodologies, with a view to reducing the huge cost of fixing software bugs.Keywords: conformance testing, finite state machine, software testing, x-machine
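The conformance-testing idea behind the tool can be sketched as a trace-equivalence check between a specification FSM and an implementation FSM. Note that X-Machines extend plain FSMs with memory and processing functions, which this toy (with hypothetical states and inputs) omits.

```python
def run_fsm(transitions, start, inputs):
    """Run a deterministic FSM (dict keyed by (state, input)) on an input
    sequence; return the visited states, or None on an undefined transition."""
    state, trace = start, [start]
    for symbol in inputs:
        if (state, symbol) not in transitions:
            return None
        state = transitions[(state, symbol)]
        trace.append(state)
    return trace

def conforms(spec, impl, start, test_sequences):
    """Check that the implementation produces the same state trace as the
    specification on every test sequence (a simple conformance check)."""
    return all(run_fsm(spec, start, seq) == run_fsm(impl, start, seq)
               for seq in test_sequences)

spec = {("idle", "start"): "running", ("running", "stop"): "idle"}
impl_ok = dict(spec)
impl_bad = {("idle", "start"): "running", ("running", "stop"): "running"}
tests = [["start", "stop"], ["start", "stop", "start"]]
print(conforms(spec, impl_ok, "idle", tests))   # True
print(conforms(spec, impl_bad, "idle", tests))  # False
```

X-Machine test-generation methods go further by deriving the test sequences themselves from the specification, with guarantees of fault detection under stated design-for-test conditions.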
Procedia PDF Downloads 268