Search results for: elaboration likelihood model theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20706

19836 Nonlinear Propagation of Acoustic Soliton Waves in Dense Quantum Electron-Positron Magnetoplasma

Authors: A. Abdikian

Abstract:

The propagation of nonlinear acoustic waves in dense electron-positron (e-p) plasmas in the presence of an external magnetic field and stationary ions (which neutralize the plasma background) is studied. By means of the quantum hydrodynamic model and the reductive perturbation method, the Zakharov-Kuznetsov equation is derived. Using the bifurcation theory of planar dynamical systems, compressive electrostatic solitary-wave and periodic travelling-wave structures are found. The numerical results show how the ion density ratio, the ion cyclotron frequency, and the direction cosines of the wave vector affect the nonlinear electrostatic travelling waves. The obtained results may help in better understanding obliquely propagating nonlinear electrostatic travelling waves of small-amplitude localized structures in dense magnetized quantum e-p plasmas, and may be applicable to the study of particle and energy transport mechanisms in compact stars such as the interior of massive white dwarfs.
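For reference, the Zakharov-Kuznetsov equation obtained from the reductive perturbation method typically takes the following generic form, where the coefficients A, B, and C depend on the plasma parameters (the specific coefficients for the magnetized e-p case are not stated in the abstract):

```latex
\frac{\partial \phi}{\partial \tau}
  + A\,\phi\,\frac{\partial \phi}{\partial \xi}
  + B\,\frac{\partial^{3} \phi}{\partial \xi^{3}}
  + C\,\frac{\partial}{\partial \xi}
    \left( \frac{\partial^{2} \phi}{\partial \eta^{2}}
         + \frac{\partial^{2} \phi}{\partial \zeta^{2}} \right) = 0
```

Here ξ is the direction of propagation, η and ζ are the transverse coordinates, and φ is the electrostatic potential perturbation.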

Keywords: bifurcation theory, phase portrait, magnetized electron-positron plasma, the Zakharov-Kuznetsov equation

Procedia PDF Downloads 245
19835 An Interpretive Study of Entrepreneurial Experience towards Achieving Business Growth Using the Theory of Planned Behaviour as a Lens

Authors: Akunna Agunwah, Kevin Gallimore, Kathryn Kinmond

Abstract:

Entrepreneurship is widely seen as a vehicle for economic growth; however, scholars have studied it from various perspectives, resulting in multiple definitions. Surprisingly, most definitions of entrepreneurship do not incorporate growth, even though economic growth is driven by the activities of entrepreneurs. The purpose of the present study is to explore the working practices of successful entrepreneurs towards achieving business growth by understanding their experiences through the lens of the Theory of Planned Behaviour (TPB). Ten successful entrepreneurs in the North West of England, from various business sectors, were interviewed using a semi-structured interview method. The recorded interviews were transcribed and subsequently evaluated using a deductive thematic technique (a qualitative approach). The themes were examined using the Theory of Planned Behaviour to ascertain the presence of its three intentional antecedents (attitude, subjective norms, and perceived behavioural control). The findings fall into two categories. First, the three intentional antecedents that make up the Theory of Planned Behaviour were evident in the transcripts. Second, the entrepreneurs are most concerned with achieving a state of freedom and realising their visions and ambitions; nevertheless, they employed these intentional antecedents to enhance business growth. In conclusion, the work presented here shows a novel, qualitative way of understanding the working practices and experiences of entrepreneurs, using the Theory of Planned Behaviour, towards enhancing business growth. Few qualitative studies exist in entrepreneurship research.
In addition, this work applies a novel approach to studying the experience of entrepreneurs by examining the working practices of successful entrepreneurs in North West England through the lens of the Theory of Planned Behaviour. Regarding TPB as a lens, the findings suggest that the entrepreneur does not differentiate between the categories of antecedents but rather sees them as processes that can be utilised to enhance business growth.

Keywords: business growth, experience, interpretive, theory of planned behaviour

Procedia PDF Downloads 217
19834 Songs from the Cradle: An Analysis of Some Selected Nupe Songs

Authors: Zainab Zendana Shafii

Abstract:

Lullabies have been broadly defined as songs that are sung to calm and soothe children. While this is true, this paper intends to show that lullabies exceed these functions. In exploring Nupe lullabies, the paper examines the various functions that lullabies perform in terms of language development, cultural enrichment, and the retelling of history as it relates to the culture of the Nupe people of northern Nigeria. The theoretical framework used is functionalist theory, which is based on the premise that all aspects of a society (institutions, roles, norms, etc.) serve a purpose and that all are indispensable for the long-term survival of that society. To this end, this paper dissects various lullabies in Nupeland with a view to exploring the meaning that these songs generate and why they are sung at all. A qualitative research methodology has been used to gather materials.

Keywords: Nupe, lullabies, Nigeria, northern

Procedia PDF Downloads 200
19833 Considering International/Local Peacebuilding Partnerships: The Stoplights Analysis System

Authors: Charles Davidson

Abstract:

This paper presents the Stoplight Analysis System of Partnering Organizations Readiness, a structured framework for evaluating the feasibility of conflict resolution collaborations, which is especially crucial in conflict areas. The framework employs a colour-coded approach with specific assessment points, with implications for more informed decision-making and improved outcomes in peacebuilding initiatives. Derived from a total of 40 years of practical peacebuilding experience between the project's two researchers, as well as interviews with various other peacebuilding actors, the system is designed to facilitate effective collaboration in international/local peacebuilding partnerships by evaluating the readiness of both the potential partner organisations and the location of the proposed project. The system categorises potential partnerships under three distinct indicators: Red (no-go), Yellow (requires further research), and Green (promising, go ahead). Within each category, specific points are identified for assessment, guiding decision-makers in evaluating the feasibility and potential success of collaboration. The Red category signals significant barriers, prompting an immediate stop to consideration of the partnership. The Yellow category encourages deeper investigation to determine whether potential issues can be mitigated, while the Green category signifies organisations deemed ready for collaboration. This systematic and structured approach empowers decision-makers to make informed choices, enhancing the likelihood of successful and mutually beneficial partnerships. Methodologically, this paper draws on interviews with peacebuilders from around the globe, scholarly research on extant strategies, and a collaborative review of the two authors' own programming from their time in the field.
The formalised model has been employed for the past two years across a range of partnership considerations and has been adjusted in response to this field experimentation. This research is significant for the field of conflict resolution because it provides a systematic and structured approach to evaluating peacebuilding partnerships. In conflict-affected regions, where the dynamics are complex and challenging, the Stoplight Analysis System offers decision-makers a practical tool to assess the readiness of partnering organisations. This approach can enhance the efficiency of conflict resolution efforts by ensuring that resources are directed towards partnerships with a higher likelihood of success, ultimately contributing to more effective and sustainable peacebuilding outcomes.
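The colour-coded logic described above can be sketched as a small decision routine. This is an illustrative sketch only: the 0-5 scoring scale, the threshold, and the assessment dimensions are invented for the example and are not part of the published system.

```python
from enum import Enum

class Light(Enum):
    RED = "no-go"
    YELLOW = "requires further research"
    GREEN = "promising, go ahead"

def assess_partnership(scores, red_flags):
    """Toy version of the stoplight logic: any red flag halts consideration
    immediately; uniformly strong scores signal readiness; anything in
    between calls for further research. Scores are assumed to be 0-5."""
    if red_flags:
        return Light.RED          # significant barrier: immediate no-go
    if scores and all(s >= 4 for s in scores.values()):
        return Light.GREEN        # deemed ready for collaboration
    return Light.YELLOW           # investigate whether issues can be mitigated

# Hypothetical assessment of one prospective local partner
result = assess_partnership(
    scores={"local_capacity": 5, "security_access": 4, "funding_stability": 4},
    red_flags=[],
)
```

In practice each assessment point would itself be the product of interviews and field review; the routine only captures the gating structure of the three indicators.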

Keywords: collaboration, conflict resolution, partnerships, peacebuilding

Procedia PDF Downloads 64
19832 Service Business Model Canvas: A Boundary Object Operating as a Business Development Tool

Authors: Taru Hakanen, Mervi Murtonen

Abstract:

This study aims to increase understanding of the transition of business models in servitization. The significance of service in all business has increased dramatically during the past decades. Service-dominant logic (SDL) describes this change in the economy and questions the goods-dominant logic on which business was primarily based in the past. The business model canvas is one of the most cited and widely used tools for defining and developing business models. The starting point of this paper lies in the notion that the traditional business model canvas is inherently goods-oriented and best suited to product-based business. However, the basic differences between goods and services necessitate changes in business model representations as servitization proceeds. Therefore, new knowledge is needed on how the conception of the business model, and the business model canvas as its representation, should be altered in servitized firms in order to better serve business developers and inter-firm co-creation. That is to say, compared to products, services are intangible and are co-produced between the supplier and the customer. Value is always co-created in interaction between a supplier and a customer, and customer experience primarily depends on how well that interaction succeeds. The role of service experience is even stronger in service business than in product business, as services are co-produced with the customer. This paper provides business model developers with a service business model canvas, which takes into account the intangible, interactive, and relational nature of service. The study employs a design science approach that contributes to theory development via design artifacts. It utilizes qualitative data gathered in workshops with ten companies from various industries.
In particular, key differences between goods-dominant logic (GDL) and SDL-based business models are identified as an industrial firm proceeds in servitization. As a result of the study, an updated version of the business model canvas, based on service-dominant logic, is provided. The service business model canvas ensures a stronger customer focus and includes aspects salient for services, such as interaction between companies, service co-production, and customer experience. It can be used for the analysis and development of a company's current service business model or for designing a new business model. It facilitates customer-focused new service design and service development, aids in the identification of development needs, and facilitates the creation of a common view of the business model. Therefore, the service business model canvas can be regarded as a boundary object, which facilitates the creation of a common understanding of the business model among the several actors involved. The study contributes to the business model and service business development disciplines by providing a managerial tool for practitioners in service development. It also provides research insight into how servitization challenges companies' business models.

Keywords: boundary object, business model canvas, managerial tool, service-dominant logic

Procedia PDF Downloads 369
19831 Influence of a Company’s Dynamic Capabilities on Its Innovation Capabilities

Authors: Lovorka Galetic, Zeljko Vukelic

Abstract:

The advanced concepts of strategic and innovation management in the sphere of company dynamic and innovation capabilities, and the achievement of their mutual alignment and a synergy effect, are important elements in business today. This paper analyses the theory and empirically investigates the influence of a company's dynamic capabilities on its innovation capabilities. A new multidimensional model of dynamic capabilities is presented, consisting of five factors appropriate to real-time requirements, while innovation capabilities are considered pursuant to the official OECD and Eurostat standards. After an examination of dynamic and innovation capabilities indicated their theoretical links, an empirical study testing the model and examining the influence of a company's dynamic capabilities on its innovation capabilities showed significant results. In the study, a research model was posed to relate company dynamic and innovation capabilities. One side of the model features the variables that are the determinants of dynamic capabilities, defined through their factors, while the other side features the determinants of innovation capabilities pursuant to the official standards. With regard to the research model, five hypotheses were set. The study was performed in late 2014 on a representative sample of large and very large Croatian enterprises with a minimum of 250 employees. The research instrument was a questionnaire administered to company top management. For both variables, the position of the company relative to industry competitors was assessed on a five-point scale. In order to test the hypotheses, correlation tests were performed to determine whether each individual factor of company dynamic capabilities correlates with the existence of its innovation capabilities, in line with the research model.
The results indicate a strong correlation between a company's possession of dynamic capabilities, in terms of the factors of the new multidimensional model presented in this paper, and its possession of innovation capabilities. Based on the results, all five hypotheses were accepted. It is concluded that there is a strong association between the dynamic and innovation capabilities of a company.

Keywords: dynamic capabilities, innovation capabilities, competitive advantage, business results

Procedia PDF Downloads 306
19830 Towards a Multilevel System of Talent Management in Small and Medium-Sized Enterprises: French Context Exploration

Authors: Abid Kousay

Abstract:

Having appeared and developed essentially in large companies and multinationals, Talent Management (TM) in Small and Medium-Sized Enterprises (SMEs) has remained an under-explored subject to this day. Although the literature on TM in the Anglo-Saxon context is developing, other contexts remain under-represented, especially France. This article therefore aims to address these shortcomings by contributing to TM issues, adopting a multilevel approach with the goal of reaching a holistic vision of the interactions between the various levels at which TM is applied. A qualitative research study, carried out within 12 SMEs in France and built on the methodological perspective of grounded theory, is used in order to go beyond description and generate or discover a theory, or even a unified theoretical explanation. Our theoretical contributions are the results of the grounded theory, the fruit of context considerations and the dynamics of the multilevel approach. We aim, first, to determine the perception of talent and TM in SMEs. Second, we formalize TM in SMEs through the empowerment of all three levels in the organization (individual, collective, and organizational), and we generate a multilevel dynamic system model highlighting the institutionalization dimension in SMEs and the managerial conviction characterized by the dominance of the leader's role. Third, this first study sheds light on the importance of rigorous implementation of TM in SMEs in France by directing CEOs and HR and TM managers to focus on the elements upstream of TM implementation that influence the system internally. Indeed, our systematic multilevel approach reminds them of the importance of strategic alignment when translating TM policy into strategies and practices in SMEs.

Keywords: French context, institutionalization, talent, multilevel approach, talent management system

Procedia PDF Downloads 202
19829 Optimization of Temperature Difference Formula at Thermoacoustic Cryocooler Stack with Genetic Algorithm

Authors: H. Afsari, H. Shokouhmand

Abstract:

When a stack is placed in the resonator of a thermoacoustic cryocooler, one extremity of the stack heats up while the other cools down due to the thermoacoustic effect. In the present work, a formula for this temperature difference is expressed using linear theory, showing which factors it depends on. The computed temperature difference is compared to the one predicted by the formula. The discrepancies cannot be attributed to non-linear effects; rather, they exist because of thermal effects. Two correction factors are introduced to bring the linear-theory results closer to the computed ones, and these correction factors are used to modify the linear theory. This corrected formula is then optimized by a genetic algorithm (GA). Finally, results are shown for different Mach numbers and stack locations in the resonator.
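As an illustration of the optimization step, the sketch below fits two correction factors c1 and c2 so that a corrected linear prediction approaches "computed" values, using a minimal elitist genetic algorithm. The data points, the correction form dT = c1*dT_linear + c2, and all GA settings are assumptions for the example, not the paper's actual formula or parameters.

```python
import random

# Made-up data: linear-theory predictions and "exact" computed values
# of the stack temperature difference at four operating points.
dT_linear = [1.0, 2.0, 3.0, 4.0]
dT_computed = [0.9, 1.7, 2.5, 3.3]

def fitness(ind):
    """Negative squared error of the corrected formula dT = c1*dT_lin + c2."""
    c1, c2 = ind
    return -sum((c1 * x + c2 - y) ** 2 for x, y in zip(dT_linear, dT_computed))

def genetic_algorithm(pop_size=40, generations=200, mut_sigma=0.1):
    random.seed(1)  # reproducible run
    pop = [(random.uniform(0.0, 2.0), random.uniform(-1.0, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)    # averaging crossover + mutation
            children.append(tuple((a + b) / 2 + random.gauss(0.0, mut_sigma)
                                  for a, b in zip(p1, p2)))
        pop = elite + children
    return max(pop, key=fitness)

c1, c2 = genetic_algorithm()
```

For this toy data the exact least-squares fit is c1 = 0.8, c2 = 0.1, so the GA should land close to those values.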

Keywords: heat transfer, thermoacoustic cryocooler, stack, resonator, Mach number, genetic algorithm

Procedia PDF Downloads 380
19828 Free Will and Compatibilism in Decision Theory: A Solution to Newcomb’s Paradox

Authors: Sally Heyeon Hwang

Abstract:

Within decision theory, there are normative principles that dictate how one should act, in addition to empirical theories of actual behavior. As a normative guide to actual behavior, evidential or causal decision-theoretic equations allow one to identify outcomes with maximal utility values. The choice that each person makes, however, will of course differ according to varying assignments of weight and probability values. Regarding these different choices, it remains a subject of considerable philosophical controversy whether individual subjects have the capacity to exercise free will with respect to the assignment of probabilities, or whether instead the assignment is in some way constrained. A version of this question is given a precise form in Richard Jeffrey's assumption that free will is necessary for Newcomb's paradox to count as a decision problem. This paper will argue, against Jeffrey, that decision theory does not require the assumption of libertarian freedom. One of the hallmarks of decision-making is its application across a wide variety of contexts; the implications of a background assumption of free will are similarly varied. One constant across the contexts of decision is that there are always at least two levels of choice for a given agent, depending on the degree of prior constraint. Within the context of Newcomb's problem, when the predictor is attempting to guess the choice the agent will make, he or she is analyzing the determined aspects of the agent, such as past characteristics, experiences, and knowledge. On the other hand, as David Lewis' backtracking argument concerning the relationship between past and present events brings to light, there are similarly varied ways in which the past can actually be dependent on the present. One implication of this argument is that even in deterministic settings, an agent can have more free will than it may seem.
This paper will thus argue against the view that a stable background assumption of free will or determinism in decision theory is necessary, arguing instead for a compatibilist decision theory yielding a novel treatment of Newcomb’s problem.
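To make the decision-theoretic stakes concrete, the sketch below computes evidential expected utilities for Newcomb's problem, reading the predictor's accuracy as the conditional probability that the prediction matches the agent's choice. The payoff values are the conventional ones from the literature; the function name and structure are illustrative, not drawn from the paper.

```python
def evidential_eu(accuracy, box_b=1_000_000, box_a=1_000):
    """Evidential expected utilities for Newcomb's problem.
    `accuracy` is read as P(the prediction matches the agent's choice):
    a one-boxer expects opaque box B to be filled with that probability,
    while a two-boxer expects it filled only when the predictor erred."""
    eu_one_box = accuracy * box_b
    eu_two_box = (1 - accuracy) * box_b + box_a
    return eu_one_box, eu_two_box

one_box, two_box = evidential_eu(0.99)   # a highly reliable predictor
# Evidential reasoning favors one-boxing here, while causal dominance
# reasoning recommends two-boxing regardless of the accuracy value.
```

The divergence between the two recommendations is exactly what makes the problem a test case for the free-will assumptions discussed above: the evidential calculation treats the agent's choice as informative about the past prediction.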

Keywords: decision theory, compatibilism, free will, Newcomb’s problem

Procedia PDF Downloads 322
19827 The Roman Fora in North Africa: Towards a Supportive Protocol to the Decision for the Morphological Restitution

Authors: Dhouha Laribi Galalou, Najla Allani Bouhoula, Atef Hammouda

Abstract:

This research delves into the fundamental question of the morphological restitution of built archaeology, in order to place it in its paradigmatic context and to seek answers to it. Indeed, understanding the object of study, analysing it, and solving the morphological problem posed are manageable only by means of a thoughtful strategy that draws on well-defined epistemological scaffolding. In this stream, the crisis of natural reasoning in archaeology has generated multiple changes in the field, ranging from the use of new tools to the integration of archaeological information systems, where urbanization involves the interplay of several disciplines. The built archaeological object is also an architectural and morphological one: a set of articulated elementary data whose understanding can be approached from a logicist point of view. Morphological restitution is no exception to the rule, and the exchange between the different disciplines uses the capacity of each to frame reflection on the incomplete elements of a given architecture, or on its different phases and multiple states of existence. The logicist sequence is furnished by the set of scattered or destroyed elements found, but also by what can be called a rule base, which contains the rules for the architectural construction of the object. The knowledge base built from the archaeological literature also provides a reference that enters into the search for forms and articulations. The choice of the Roman forum in North Africa is justified by the great urban and architectural characteristics of this entity. Research on the forum draws on a fairly large knowledge base and also provides the researcher with material to study, from a morphological and architectural point of view, from the scale of the city down to the architectural detail.
The experimentation with the knowledge deduced at the paradigmatic level, as well as the deduction of an analysis model, is then carried out on the basis of a well-defined context, which frames the experimentation from the elaboration of the morphological information container attached to the rule base and the knowledge base. The use of logicist analysis and artificial intelligence has allowed us first to question the aspects already known, in order to measure the credibility of our system, which remains above all a decision support tool for the morphological restitution of Roman fora in North Africa. This paper presents a first experimentation of the model elaborated during this research, a model framed by a paradigmatic discussion that tries to position the research in relation to the existing paradigmatic and experimental knowledge on the issue.

Keywords: classical reasoning, logicist reasoning, archaeology, architecture, Roman forum, morphology, calculation

Procedia PDF Downloads 149
19826 Stabilizing Effect of Magnetic Field in a Thermally Modulated Porous Layer

Authors: M. Meenasaranya, S. Saravanan

Abstract:

Nonlinear stability analysis is carried out to determine the effect of surface temperature modulation in an infinite horizontal porous layer heated from below. The layer is saturated by an electrically conducting, viscous, incompressible and Newtonian fluid. The Brinkman model is used for momentum equation, and the Boussinesq approximation is invoked. The system is assumed to be bounded by rigid boundaries. The energy theory is implemented to find the global exponential stability region of the considered system. The results are analysed for arbitrary values of modulation frequency and amplitude. The existence of subcritical instability region is confirmed by comparing the obtained result with the known linear result. The vertical magnetic field is found to stabilize the system.

Keywords: Brinkman model, energy method, magnetic field, surface temperature modulation

Procedia PDF Downloads 396
19825 Mathematical Based Forecasting of Heart Attack

Authors: Razieh Khalafi

Abstract:

Myocardial infarction (MI), or acute myocardial infarction (AMI), commonly known as a heart attack, occurs when blood flow stops to part of the heart, causing damage to the heart muscle. An ECG can often show evidence of a previous heart attack or one that is in progress. The patterns on the ECG may indicate which part of the heart has been damaged, as well as the extent of the damage. In chaos theory, the correlation dimension is a measure of the dimensionality of the space occupied by a set of random points, often referred to as a type of fractal dimension. In this research, by considering the ECG signal as a random walk, we work on forecasting an oncoming heart attack by analyzing ECG signals using the correlation dimension. In order to test the model, a set of ECG signals from patients before and after heart attacks was used, and the strength of the model in forecasting the behavior of these signals was checked. The results show that this methodology can forecast the ECG, and accordingly a heart attack, with high accuracy.
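The correlation dimension referred to here is usually estimated with the Grassberger-Procaccia procedure: delay-embed the scalar signal, count point pairs closer than a radius r, and read the dimension off the slope of log C(r) versus log r. The sketch below shows a minimal two-radius version on a synthetic random walk standing in for an ECG record; the embedding and radius parameters are illustrative, not those used in the study.

```python
import math
import random

def delay_embed(series, dim, tau):
    """Time-delay embedding of a scalar series into dim-dimensional points."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + k * tau] for k in range(dim)) for i in range(n)]

def correlation_sum(points, r):
    """C(r): fraction of distinct point pairs closer than r."""
    n, count = len(points), 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < r:
                count += 1
    return 2.0 * count / (n * (n - 1))

def correlation_dimension(series, dim=3, tau=2, r1=0.2, r2=0.8):
    """Two-radius slope estimate of the correlation dimension D2."""
    pts = delay_embed(series, dim, tau)
    c1, c2 = correlation_sum(pts, r1), correlation_sum(pts, r2)
    return (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))

# Synthetic stand-in for an ECG signal: a Gaussian random walk
random.seed(0)
walk = [0.0]
for _ in range(400):
    walk.append(walk[-1] + random.gauss(0.0, 0.05))
d2 = correlation_dimension(walk)
```

A production estimator would fit the slope over many radii within the scaling region rather than just two, but the two-point slope shows the idea.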

Keywords: heart attack, ECG, random walk, correlation dimension, forecasting

Procedia PDF Downloads 543
19824 A Conceptual Framework for Integrating Musical Instrument Digital Interface Composition in the Music Classroom

Authors: Aditi Kashi

Abstract:

While educational technologies have taken great strides, especially in Musical Instrument Digital Interface (MIDI) composition, teachers across the world are still adjusting to incorporating such technology into their curricula. Although using MIDI in the classroom has become more common, limited class time and a strong focus on performance have made composition a lesser priority. The balance between music theory, performance time, and composition learning is delicate and difficult to maintain for many music educators, which makes including MIDI in the classroom challenging. To address this issue, this paper outlines a general conceptual framework, centered around a key element of music theory, for integrating MIDI composition into the music classroom to not only introduce students to digital composition but also enhance their understanding of music theory and its applicability.

Keywords: educational framework, education technology, MIDI, music education

Procedia PDF Downloads 87
19823 Empirical Evaluation of Game Components Based on Learning Theory: A Preliminary Study

Authors: Seoi Lee, Dongjoo Chin, Heewon Kim

Abstract:

Gamification refers to a technique that applies game elements to non-gaming contexts, such as education and exercise, to make people more engaged in these behaviors. The purpose of this study was to identify effective elements of gamification for changing human behaviors. To accomplish this purpose, a survey based on learning theory was developed, especially for assessing the antecedents and consequences of behaviors, and 8 popular and 8 unpopular games were selected for comparison. A total of 407 adult males and females were recruited via a crowdsourcing Internet marketplace and completed the survey, which consisted of 19 questions on antecedents and 14 on consequences. Results showed no significant differences between popular and unpopular games on the consequence questions. On the antecedent questions, popular games were superior to unpopular games in character customization, play-type selection, a sense of belonging, patch update cycle, and influence or dominance. This study is significant in that it reveals the elements of gamification based on learning theory. Future studies need to empirically validate whether these factors affect behavioral change.

Keywords: gamification, learning theory, antecedent, consequence, behavior change, behaviorism

Procedia PDF Downloads 224
19822 Parallels Between Indian Art Music and Western Art Music: The Suppression of the Notion of the 'Melody'

Authors: Kedarnath Awati

Abstract:

Some parallels between Indian art music and Western art music, such as the identity of the basic heptatonic scale structure, are quite obvious and need no further discussion. Other parallels are far less obvious, and it is one of these that the author is interested in. Specifically, the author would like to make a serious claim that in both types of music there is an unspoken dependence on melody. It is true that the techniques the two systems use for elaboration are very different: Western music uses the techniques of harmony, counterpoint, orchestration, and motivic variation, while the Indian systems, both the Hindustani and the Carnatic traditions, use the technique of raagdaari. The reason this point is barely spoken about is that artists, in the West as well as in India, tend to think of melody as something elementary or as something 'given'. Indian musicians would much rather dwell upon this or that meend or taan or other technical device, while the West thinks that melody is passé and would rather discuss the merits and demerits of spectralism and perhaps serialism. The author explores this theme further in the paper.

Keywords: Indian art music, Western art music, melody, raagdaari, motivic variation

Procedia PDF Downloads 65
19821 Global Mittag-Leffler Stability of Fractional-Order Bidirectional Associative Memory Neural Network with Discrete and Distributed Transmission Delays

Authors: Swati Tyagi, Syed Abbas

Abstract:

Fractional-order Hopfield neural networks are generally used to model information processing among interacting neurons. To show the constancy of the processed information, it is required to analyze the stability of these systems. In this work, we perform a Mittag-Leffler stability analysis for the corresponding Caputo fractional-order bidirectional associative memory (BAM) neural networks with various time delays. We derive sufficient conditions to ensure the existence and uniqueness of the equilibrium point by using topological degree theory. By applying the fractional Lyapunov method and Mittag-Leffler functions, we derive sufficient conditions for global Mittag-Leffler stability, which further implies the global asymptotic stability of the network equilibrium. Finally, we present two suitable examples to show the effectiveness of the obtained results.
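The Mittag-Leffler function that gives this stability notion its name generalizes the exponential: states of a Mittag-Leffler stable system are bounded by an envelope of the form E_alpha(-lambda * t**alpha). A minimal series evaluation, adequate only for moderate arguments, can be sketched as follows (the order alpha and rate lambda below are arbitrary illustrative values):

```python
import math

def mittag_leffler(alpha, z, terms=120):
    """One-parameter Mittag-Leffler function E_alpha(z) by direct series
    summation: sum_{k>=0} z**k / Gamma(alpha*k + 1). Adequate only for
    moderate |z|; dedicated algorithms are needed for large arguments."""
    return sum(z ** k / math.gamma(alpha * k + 1) for k in range(terms))

# Sanity check: E_1 is the ordinary exponential
assert abs(mittag_leffler(1.0, -2.0) - math.exp(-2.0)) < 1e-9

# Decay envelope E_alpha(-lam * t**alpha) of a Mittag-Leffler stable state
alpha, lam = 0.8, 1.5
envelope = [mittag_leffler(alpha, -lam * t ** alpha) for t in (0.0, 1.0, 2.0, 4.0)]
```

For 0 < alpha < 1 this envelope decays algebraically rather than exponentially, which is the characteristic signature of fractional-order dynamics.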

Keywords: bidirectional associative memory neural network, existence and uniqueness, fractional-order, Lyapunov function, Mittag-Leffler stability

Procedia PDF Downloads 366
19820 Objectifying Media and Preadolescents' Media Internalization: A Developmental Perspective

Authors: Ann Rousseau, Steven Eggermont

Abstract:

The current study sought to explain pre-adolescents’ differential susceptibility to the internalization of mediated appearance ideals, using a three-wave panel survey of preadolescent girls and boys (N = 973, Mage = 11.14). Based on the premises of objectification theory and sexual script theory, we proposed a double role for pubertal timing and cross-sex interactions in preadolescents’ media internalization. More specifically, we expected pubertal timing and cross-sex interactions to (a) trigger higher levels of media internalization, directly and indirectly via body surveillance, and (b) positively moderate the relationship between objectifying media exposure and girls’ and boys’ media internalization. A first cross-lagged model tested whether the pubertal timing and cross-sex interactions could trigger preadolescents media internalization and body surveillance. Structural equation analysis indicated that pubertal timing (Wave1) positively predicted body surveillance and media internalization (both Wave3). Cross-sex involvement (Wave1) was positively linked to media internalization (Wave2), but body surveillance (Wave2) was not associated with cross-sex interactions. Results also showed a reciprocal interaction between media internalization (Wave 2 and 3) and body surveillance (Wave2 and 3). Multiple group analysis showed that the observed relationships did not vary by gender. A second moderated moderation model examined whether (a) the relationship between objectifying media exposure (television and magazines, both Wave1) and media internalization (Wave3) depended on pubertal timing (Wave1), and (b) the two-way interaction between objectifying media exposure (Wave1) and pubertal timing (Wave1) varied depending on cross-sex interactions (Wave1). Results revealed that cross-sex interactions functioned as a buffer against media internalization. 
For preadolescents who had fewer cross-sex interactions, early puberty (relative to peers) positively moderated the relationship between magazine exposure and the internalization of mediated appearance ideals. No significant relationships were found for television. Again, no gender difference could be observed. The present study suggests a double role for pubertal timing and cross-sex interactions in preadolescents’ media internalization and indicates that early developers with few cross-sex experiences are particularly vulnerable to media internalization. Additionally, the current findings suggest that there is relative gender equity in magazines’ ability to cultivate media internalization among preadolescents.
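As an illustration of the moderated moderation (three-way interaction) logic described above, the toy regression below simulates internalization scores with made-up coefficients and recovers the interaction terms by ordinary least squares. All variable names and effect sizes are illustrative assumptions, not the authors' SEM estimates:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
media = rng.normal(size=n)      # objectifying media exposure (standardized)
puberty = rng.normal(size=n)    # pubertal timing
crosssex = rng.normal(size=n)   # frequency of cross-sex interactions

# Internalization scores with a buffering three-way interaction (made-up effects)
y = (0.3 * media + 0.2 * puberty - 0.1 * crosssex
     + 0.25 * media * puberty             # exposure effect grows with early puberty
     - 0.15 * media * puberty * crosssex  # ...unless cross-sex interactions buffer it
     + rng.normal(scale=0.5, size=n))

X = np.column_stack([np.ones(n), media, puberty, crosssex,
                     media * puberty, media * crosssex, puberty * crosssex,
                     media * puberty * crosssex])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[4] estimates the two-way moderation, beta[7] the moderated moderation
```

A significant negative coefficient on the three-way term is what "cross-sex interactions buffer the moderation" means operationally.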

Keywords: cross-sex interactions, media effects, objectification theory, pubertal timing

Procedia PDF Downloads 329
19819 Numerical Simulation of Fiber Bragg Grating Spectrum for Mode-І Delamination Detection

Authors: O. Hassoon, M. Tarfoui, A. El Malk

Abstract:

A fiber Bragg grating (FBG) sensor embedded in a composite material can detect and monitor damage occurring in the composite structure. In this paper, we address mode-Ι delamination to determine the resistance of the material to crack propagation, and we use coupled-mode theory and the T-matrix method to simulate the FBG spectrum for both uniform and non-uniform strain distributions. The double cantilever beam test is modeled in FEM to determine the longitudinal strain; two models are used, the first being a global half model and the second a sub-model representing the FBG with a refined mesh. This method can simulate damage in the composite structure and convert the strain into a wavelength shift of the FBG spectrum.
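A piecewise-uniform T-matrix simulation of the FBG reflection spectrum can be sketched as below. The grating parameters (coupling coefficient κ, effective index, Bragg wavelength, photo-elastic coefficient p_e) are illustrative assumptions, not values from the paper:

```python
import numpy as np

def fbg_spectrum(wavelengths, strain_profile, L=0.01, n_eff=1.447,
                 lambda_B0=1550e-9, kappa=200.0, p_e=0.22):
    """Piecewise-uniform transfer-matrix (T-matrix) model of an FBG:
    each section carries a local strain that shifts its Bragg wavelength."""
    dz = L / len(strain_profile)
    R = np.empty(len(wavelengths))
    for i, lam in enumerate(wavelengths):
        T = np.eye(2, dtype=complex)
        for eps in strain_profile:
            lam_B = lambda_B0 * (1.0 + (1.0 - p_e) * eps)  # strain-shifted Bragg wavelength
            delta = 2.0 * np.pi * n_eff * (1.0 / lam - 1.0 / lam_B)  # detuning
            gamma = np.sqrt(complex(kappa ** 2 - delta ** 2))
            c, s = np.cosh(gamma * dz), np.sinh(gamma * dz)
            T = np.array([[c - 1j * (delta / gamma) * s, -1j * (kappa / gamma) * s],
                          [1j * (kappa / gamma) * s, c + 1j * (delta / gamma) * s]]) @ T
        R[i] = abs(T[1, 0] / T[0, 0]) ** 2  # power reflectivity
    return R
```

For a uniform (zero-strain) grating, the peak reflectivity reduces to the coupled-mode result tanh²(κL); a non-uniform strain profile chirps the local Bragg wavelength and distorts the spectrum, which is what makes the spectral shape a delamination signature.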

Keywords: fiber Bragg grating, delamination detection, DCB, FBG spectrum, structural health monitoring

Procedia PDF Downloads 365
19818 [Keynote Talk]: Evidence Fusion in Decision Making

Authors: Mohammad Abdullah-Al-Wadud

Abstract:

In the current era of automation and artificial intelligence, systems increasingly depend on the decision-making capabilities of machines. Such systems/applications may range from simple classifiers to sophisticated surveillance systems based on traditional sensors and related equipment, which are becoming more common in the Internet of Things (IoT) paradigm. However, the available data for such problems are usually imprecise and incomplete, which leads to uncertainty in decisions made by traditional probability-based classifiers. This calls for a robust fusion framework to combine the available information sources with some degree of certainty. The theory of evidence provides such a method for combining evidence from different (possibly unreliable) sources/observers. This talk will address the employment of the Dempster-Shafer theory of evidence in some practical applications.
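As a concrete illustration, Dempster's rule of combination can be sketched in a few lines over mass functions whose focal elements are frozensets; the mass assignments in the test below are invented for illustration:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) with Dempster's
    rule; mass assigned to empty intersections (conflict) is renormalized away."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence, combination undefined")
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}
```

The normalization by 1 − conflict is what distinguishes Dempster's rule from unnormalized conjunctive combination; with highly conflicting sources it can behave counter-intuitively, which motivates alternatives such as the Dezert-Smarandache rules mentioned elsewhere in these abstracts.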

Keywords: decision making, Dempster-Shafer theory, evidence fusion, incomplete data, uncertainty

Procedia PDF Downloads 427
19817 The Nation in Turmoil: A Post-Colonial Critique of Mqapheli Mngadi's Cartoons

Authors: Sizwe Dlamini

Abstract:

Little has been done to investigate cartoons from a literary criticism point of view; cartoons have mostly received attention in semiotics compared to other scholarly perspectives. The aim of this article is to attempt to bridge this gap by examining cartoons through the post-colonial approach as a literary theory. Even though the post-colonial approach has previously been adopted to critique the prose genre and other genres in the African indigenous languages of South Africa, there seems to be no study that has used this approach to analyse the cartoon genre, and the study is believed to contribute to scientific knowledge in this sense. The study adopts textual analysis as a qualitative research technique, since cartoons are the primary sources of data. Through the application of post-colonial theory, the findings of the study demonstrate that Mngadi's editorial cartoons depict socio-cultural, socio-economic, and political issues.

Keywords: editorial cartoons, post-colonial theory, literary criticism, turmoil

Procedia PDF Downloads 20
19816 High Gain Broadband Plasmonic Slot Nano-Antenna

Authors: H. S. Haroyan, V. R. Tadevosyan

Abstract:

A high-gain broadband plasmonic slot nano-antenna is considered, and a theory of the plasmonic slot nano-antenna (PSNA) has been developed. The analytical model also takes into account the electric field inside the metal, owing to the imperfect conductivity of metals in the optical range, and a numerical investigation based on the FEM has been carried out. The Yagi-Uda configuration is known to improve directivity in the plane of the structure. In contrast, this paper demonstrates that the directivity of the proposed PSNA can be improved in the plane perpendicular to the structure by placing a reflecting metallic surface at a fixed distance beneath the slot. An improvement in directivity increases the antenna gain, a method of pattern shaping well known from RF antenna design theory. Moreover, improving directivity in the perpendicular plane offers more flexibility in applications such as enhancing interactions between light and atoms, ions, or molecules using this type of plasmonic slot antenna. By analogy with dipole-type optical antennas, the range of working wavelengths is widened by using a bowtie slot geometry, which makes the antenna broadband.

Keywords: broadband antenna, high gain, slot nano-antenna, plasmonics

Procedia PDF Downloads 371
19815 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect change in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracy and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors will contain uncertainties, and sometimes the data are conflicting. In this study, we present a framework that takes advantage of the capabilities of evidence theory (a.k.a. the Dempster-Shafer and Dezert-Smarandache theories) for representing and managing uncertainty and conflict, in order to achieve fast change detection and effectively deal with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and related combination rules are applied to combine the mass values among all sensors. Furthermore, we apply the method to estimate the minimum number of sensors needed for combination, so that computational efficiency can be improved. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
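The quickest-detection core can be illustrated with a plain log-likelihood-ratio CUSUM for a Gaussian mean shift. This is a minimal sketch of the change-detection step only, not the paper's evidence-fusion pipeline; the distributions, threshold, and seed are illustrative assumptions:

```python
import numpy as np

def cusum_change_point(x, mu0, mu1, sigma, threshold):
    """One-sided CUSUM on the log-likelihood ratio of two Gaussian mean
    hypotheses; returns the first index at which the statistic crosses
    the threshold, or -1 if no change is declared."""
    llr = (x - (mu0 + mu1) / 2.0) * (mu1 - mu0) / sigma ** 2  # pointwise log LR
    s = 0.0
    for t, l in enumerate(llr):
        s = max(0.0, s + l)  # reflected random walk: resets at zero
        if s > threshold:
            return t
    return -1
```

Raising the threshold lowers the false alarm rate at the cost of detection delay, the trade-off the abstract refers to as "as quickly as possible with tolerable false alarm rate".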

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data

Procedia PDF Downloads 335
19814 A Folk’s Theory of the MomConnect (mHealth) Initiative in South Africa

Authors: Eveline Muika Kabongo, Peter Delobelle, Ferdinand Mukumbang, Edward Nicol

Abstract:

Introduction: Studies have been conducted to establish the effect of the MomConnect program in South Africa, but these studies did not focus on the stakeholders' and implementers' perspectives or the underlying program theory of the MomConnect initiative. We strived to obtain stakeholders’ perspectives and assumptions on the MomConnect program and to develop an initial program theory (IPT) of how the MomConnect initiative was expected to work. Methods: A realist-informed explanatory design was used. Interviews were performed with 10 key informants selected purposively at the national level of the NDoH South Africa. The interviews were conducted via Zoom and lasted 30 to 60 minutes. Deductive, inductive, and abductive inferencing approaches were applied during the analysis. The ICAMO heuristic framework was used to analyse the data in order to capture the key informants' expectations of how the MomConnect would or would not work. Results: We developed three folk’s theories illustrating how the key informants expected the MomConnect to work. These theories showed that the MomConnect intended to provide users with health information and education that would empower and motivate them with knowledge, allowing improved health service delivery among HCPs, improved uptake of MCH services among pregnant women and mothers, and a decreased rate of maternal and child mortality in the country. The lack of an updated mechanism to link women to the outcome was an issue. Another problem highlighted was the introduction of the WhatsApp program instead of SMS messaging, which was free of charge to women. Conclusion: The folk’s theories developed from this study provided insight into how the MomConnect was expected to work and what did not work. 
The folk’s theories will be merged with information from candidate theories, a synthesis review, and a document review to develop our initial program theory of the MomConnect initiative.

Keywords: mHealth, MomConnect program, realist evaluation, maternal and child health, maternal and child health services, introduction, theory-driven

Procedia PDF Downloads 199
19813 Construction and Optimization of Green Infrastructure Network in Mountainous Counties Based on Morphological Spatial Pattern Analysis and Minimum Cumulative Resistance Models: A Case Study of Shapingba District, Chongqing

Authors: Yuning Guan

Abstract:

Under the background of rapid urbanization, mountainous counties need to break through mountain barriers for urban expansion due to undulating topography, resulting in ecological problems such as landscape fragmentation and reduced biodiversity. Green infrastructure networks are constructed to alleviate the contradiction between urban expansion and ecological protection, promoting the healthy and sustainable development of urban ecosystems. This study applies the MSPA model, the MCR model, and the Linkage Mapper tools to identify eco-sources and eco-corridors in the Shapingba District of Chongqing, combined with landscape connectivity assessment and circuit theory to delineate importance levels and extract ecological pinch-point areas along the corridors. The results show that: (1) 20 ecological sources are identified, with a total area of 126.47 km², accounting for 31.88% of the study area, and showing a pattern of ‘one core, three corridors, multi-point distribution’. (2) 37 ecological corridors are formed in the area, with a total length of 62.52 km, in a ‘more in the west, less in the east’ pattern. (3) 42 ecological pinch points are extracted, accounting for 25.85% of the length of the corridors, mainly distributed in the eastern new area. Accordingly, this study proposes optimization strategies for sub-area protection of ecological sources, grade-level construction of ecological corridors, and precise restoration of ecological pinch points.
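The MCR idea, accumulating resistance along least-cost paths outward from ecological sources, can be sketched with a grid Dijkstra. This is a minimal illustration, not the Linkage Mapper implementation; the 4-connected neighborhood and the mean-of-two-cells boundary cost are simplifying assumptions:

```python
import heapq
import numpy as np

def cumulative_resistance(resistance, sources):
    """Minimum cumulative resistance (MCR) surface: cost-distance from a
    set of (row, col) source cells over a 4-connected resistance grid."""
    rows, cols = resistance.shape
    dist = np.full((rows, cols), np.inf)
    pq = []
    for r, c in sources:
        dist[r, c] = 0.0
        heapq.heappush(pq, (0.0, r, c))
    while pq:                                   # Dijkstra over grid cells
        d, r, c = heapq.heappop(pq)
        if d > dist[r, c]:
            continue                            # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # cost of one move: mean resistance of the two cells crossed
                nd = d + 0.5 * (resistance[r, c] + resistance[nr, nc])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(pq, (nd, nr, nc))
    return dist
```

Corridors then fall along ridges of low summed cost between the surfaces of two source sets; circuit-theory pinch points are cells where many near-optimal paths are squeezed together.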

Keywords: green infrastructure network, morphological spatial pattern, minimal cumulative resistance, mountainous counties, circuit theory, Shapingba District

Procedia PDF Downloads 46
19812 Diagnostic Performance of Mean Platelet Volume in the Diagnosis of Acute Myocardial Infarction: A Meta-Analysis

Authors: Kathrina Aseanne Acapulco-Gomez, Shayne Julieane Morales, Tzar Francis Verame

Abstract:

Mean platelet volume (MPV) is the most accurate measure of the size of platelets and is routinely measured by most automated hematological analyzers. Several studies have shown associations between MPV and cardiovascular risks and outcomes. Although its measurement may provide useful data, MPV remains a diagnostic tool yet to be included in routine clinical decision-making. The aim of this systematic review and meta-analysis is to determine summary estimates of the diagnostic accuracy of mean platelet volume for the diagnosis of myocardial infarction among adult patients with angina and/or its equivalents, in terms of sensitivity, specificity, diagnostic odds ratio, and likelihood ratios, and to determine the difference in mean MPV values between those with MI and non-MI controls. The primary search was done through the electronic databases PubMed, Cochrane Review CENTRAL, HERDIN (Health Research and Development Information Network), Google Scholar, the Philippine Journal of Pathology, and the Philippine College of Physicians Philippine Journal of Internal Medicine. The reference lists of original reports were also searched. Cross-sectional, cohort, and case-control articles studying the diagnostic performance of mean platelet volume in the diagnosis of acute myocardial infarction in adult patients were included in the study. Studies were included if: (1) CBC was taken upon presentation to the ER or upon admission (within 24 hours of symptom onset); (2) myocardial infarction was diagnosed with serum markers, ECG, or according to accepted guidelines of the cardiology societies (the American Heart Association (AHA), the American College of Cardiology (ACC), the European Society of Cardiology (ESC)); and (3) outcomes were measured as a significant difference AND/OR sensitivity and specificity. The authors independently screened all the identified potential studies resulting from the search for inclusion. 
Eligible studies were appraised using well-defined criteria. Any disagreement between the reviewers was resolved through discussion and consensus. The overall mean MPV value of those with MI (9.702 fl; 95% CI 9.07 – 10.33) was higher than that of the non-MI control group (8.85 fl; 95% CI 8.23 – 9.46). Interpretation of the calculated t-value of 2.0827 showed that there was a significant difference between the mean MPV values of those with MI and those of the non-MI controls. The summary sensitivity (Se) and specificity (Sp) for MPV were 0.66 (95% CI; 0.59 - 0.73) and 0.60 (95% CI; 0.43 – 0.75), respectively. The pooled diagnostic odds ratio (DOR) was 2.92 (95% CI; 1.90 – 4.50). The positive likelihood ratio of MPV in the diagnosis of myocardial infarction was 1.65 (95% CI; 1.20 – 22.27), and the negative likelihood ratio was 0.56 (95% CI; 0.50 – 0.64). The intended role for MPV in the diagnostic pathway of myocardial infarction would perhaps be best as a triage tool. With a DOR of 2.92, MPV values can discriminate between those who have MI and those without. For a patient with angina presenting with an elevated MPV value, the odds of MI are 1.65 times higher. Thus, the decision to treat a patient with angina or its equivalents as a case of MI could be supported by an elevated MPV value.
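The likelihood ratios and diagnostic odds ratio follow directly from the pooled sensitivity and specificity; the sketch below back-calculates them. (Note that the pooled DOR of 2.92 reported above comes from study-level pooling, so the back-calculated value matches it only approximately.)

```python
def diagnostic_metrics(sensitivity, specificity):
    """Likelihood ratios and diagnostic odds ratio from summary
    sensitivity and specificity."""
    lr_pos = sensitivity / (1.0 - specificity)   # LR+ = Se / (1 - Sp)
    lr_neg = (1.0 - sensitivity) / specificity   # LR- = (1 - Se) / Sp
    dor = lr_pos / lr_neg                        # diagnostic odds ratio
    return lr_pos, lr_neg, dor
```

With the reported Se = 0.66 and Sp = 0.60, this gives LR+ = 1.65, LR- ≈ 0.57, and DOR ≈ 2.91, consistent with the summary estimates in the abstract.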

Keywords: mean platelet volume, MPV, myocardial infarction, angina, chest pain

Procedia PDF Downloads 87
19811 Minimizing the Impact of Covariate Detection Limit in Logistic Regression

Authors: Shahadut Hossain, Jacek Wesolowski, Zahirul Hoque

Abstract:

In many epidemiological and environmental studies, covariate measurements are subject to a detection limit. In most applications, covariate measurements are truncated from below, which is known as left-truncation, because the measuring device fails to detect values falling below a certain threshold. In regression analyses, this causes bias and inflated mean squared error (MSE) in the estimators. This paper suggests a response-based regression calibration method to correct the deleterious impact of the covariate detection limit on the estimators of the parameters of the simple logistic regression model. Compared to the maximum likelihood method, the proposed method is computationally simpler, and hence easier to implement. It is robust to violation of the distributional assumption about the covariate of interest. The performance of the proposed method in producing correct inference, relative to competing methods, has been investigated through extensive simulations. A real-life application of the method is also shown using data from a population-based case-control study of non-Hodgkin lymphoma.
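The deleterious effect of a detection limit can be illustrated by simulation: below, a logistic model is fitted to a covariate whose values under the limit are replaced by zero (a common ad-hoc substitution). The data-generating values are invented for illustration, and the fit is a plain Newton-Raphson, not the authors' response-based regression calibration:

```python
import numpy as np

def fit_logistic(X, y, iters=30):
    """Newton-Raphson fit of a logistic regression (intercept column in X)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        H = (X * (p * (1.0 - p))[:, None]).T @ X   # observed information
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

rng = np.random.default_rng(7)
n = 20000
x = rng.exponential(1.0, n)                        # true covariate
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(-1.0 + x))))  # logit P(y=1) = -1 + x
dl = 1.0                                           # detection limit
x_naive = np.where(x < dl, 0.0, x)                 # ad-hoc zero substitution

b_full = fit_logistic(np.column_stack([np.ones(n), x]), y)
b_naive = fit_logistic(np.column_stack([np.ones(n), x_naive]), y)
# b_full[1] sits close to the true slope 1.0; b_naive[1] is noticeably biased
```

The gap between the two slope estimates is the bias that substitution-style fixes introduce and that regression calibration is designed to remove.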

Keywords: environmental exposure, detection limit, left truncation, bias, ad-hoc substitution

Procedia PDF Downloads 238
19810 Ordinary Differential Equations (ODE) Reconstruction of High-Dimensional Genetic Networks through Game Theory with Application to Dissecting Tree Salt Tolerance

Authors: Libo Jiang, Huan Li, Rongling Wu

Abstract:

Ordinary differential equations (ODEs) have proven powerful for reconstructing precise and informative gene regulatory networks (GRNs) from dynamic gene expression data. However, joint modeling and analysis of all genes, essential for the systematic characterization of genetic interactions, is challenging due to high dimensionality and a complex pattern of genetic regulation including activation, repression, and antitermination. Here, we address these challenges by unifying variable selection and game theory through ODEs. Each gene within a GRN is co-expressed with its partner genes in a manner resembling a game of multiple players, each of which tends to choose an optimal strategy to maximize its “fitness” across the whole network. Based on this unifying theory, we designed and conducted a real experiment to infer salt tolerance-related GRNs for the Euphrates poplar, a hero tree that can grow in the saline desert. The pattern and magnitude of interactions between several hub genes within these GRNs were found to determine the capacity of the Euphrates poplar to resist saline stress.
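A minimal version of ODE-based network reconstruction with L1 variable selection can be sketched as follows: derivatives are estimated by finite differences, and each gene's rate is regressed on all expression levels with a LASSO penalty, solved here by ISTA (proximal gradient). The toy linear network and penalty value are illustrative assumptions, not the paper's game-theoretic model:

```python
import numpy as np

def lasso_ista(A, b, lam, iters=4000):
    """Minimize 0.5*||A w - b||^2 + lam*||w||_1 by proximal gradient (ISTA)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        z = w - step * (A.T @ (A @ w - b))            # gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # soft-threshold
    return w

def infer_grn(X, t, lam=0.01):
    """Sparse linear-ODE inference: fit dx_i/dt = sum_j W[i, j] * x_j per gene.
    X is a (timepoints, genes) expression matrix; returns the weight matrix."""
    dX = np.gradient(X, t, axis=0)           # finite-difference derivatives
    return np.vstack([lasso_ista(X, dX[:, i], lam) for i in range(X.shape[1])])

# Toy three-gene network: gene 0 activates gene 1; gene 2 is isolated
W_true = np.array([[-0.5, 0.0, 0.0],
                   [1.0, -0.5, 0.0],
                   [0.0, 0.0, -1.0]])
t = np.arange(0.0, 5.0, 0.01)
X = np.empty((len(t), 3))
X[0] = [1.0, 0.5, 0.2]
for k in range(len(t) - 1):                  # explicit-Euler simulation
    X[k + 1] = X[k] + 0.01 * (W_true @ X[k])

W_est = infer_grn(X, t)
```

The L1 penalty zeroes out spurious edges, which is the variable-selection half of the paper's framework; the game-theoretic half replaces the independent per-gene fits with coupled best-response strategies.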

Keywords: gene regulatory network, ordinary differential equation, game theory, LASSO, saline resistance

Procedia PDF Downloads 640
19809 A Sequential Approach for Random-Effects Meta-Analysis

Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya

Abstract:

The objective in meta-analysis is to combine results from several independent studies in order to generalize and provide an evidence base for decision making. However, recent studies show that the magnitude of effect size estimates reported in many areas of research changes with publication year, which can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring the effect size estimates in meta-analysis. However, they are based on statistical theory applicable to the fixed-effect model (FEM). For the random-effects model (REM), the analysis incorporates the heterogeneity variance, tau-squared, whose estimation creates complications. This paper proposes the use of the Gombay and Serbian (2005) truncated CUSUM-type test with asymptotically valid critical values for sequential monitoring of the REM. Simulation results show that the test does not control the Type I error well and is not recommended. Further work is required to derive an appropriate test in this important area of application.
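For reference, the REM pooling that a sequential procedure must monitor is typically the DerSimonian-Laird estimator, whose method-of-moments tau-squared is exactly the complicating quantity mentioned above. A minimal sketch (of the pooling step only, not the truncated CUSUM-type test):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling: method-of-moments
    tau-squared estimate, then inverse-variance weighting."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                              # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - mu_fe) ** 2)         # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / c)  # truncated at zero
    w_re = 1.0 / (v + tau2)                  # random-effects weights
    mu_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, se_re, tau2
```

Because tau-squared is re-estimated (and truncated at zero) every time a study is added, the increments of the monitored statistic are not independent, which is one source of the Type I error inflation reported in the simulations.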

Keywords: meta-analysis, random-effects model, sequential test, temporal changes in effect sizes

Procedia PDF Downloads 469
19808 Modeling Approach to Better Control Fouling in a Submerged Membrane Bioreactor for Wastewater Treatment: Development of Analytical Expressions in Steady-State Using ASM1

Authors: Benaliouche Hana, Abdessemed Djamal, Meniai Abdessalem, Lesage Geoffroy, Heran Marc

Abstract:

This paper presents a dynamic mathematical model of activated sludge which is able to predict the formation and degradation kinetics of SMP (soluble microbial products) in membrane bioreactor systems. The model is based on a calibrated version of ASM1 combined with the theory of production and degradation of SMP, and it was calibrated on experimental data from a membrane bioreactor (MBR) pilot plant. Analytical expressions have been developed describing the concentrations of the main state variables present in the sludge matrix, with the inclusion of only six additional linear differential equations. The objective is to present a new dynamic mathematical model of activated sludge capable of predicting the formation and degradation kinetics of SMP (UAP and BAP) in the submerged membrane bioreactor (BRMI), operating at low organic load (C/N = 3.5), for two sludge retention times (SRT) fixed at 40 days and 60 days, in order to study their impact on membrane fouling. The modeling study was carried out under the steady-state condition. The analytical expressions were then validated by comparing their results with those obtained by simulations using the GPS-X (Hydromantis) software. These equations made it possible, by means of ASM1 modeling approaches, to identify the operating and kinetic parameters and help predict membrane fouling.
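At steady state, each linear SMP balance in such a model reduces to a production-degradation equation of the form dS/dt = P − k·S, whose analytical solution is S* = P/k. The sketch below checks this against a transient simulation; the rate constants are invented for illustration, not the calibrated ASM1 parameters:

```python
def smp_steady_state(production, k_deg):
    """Analytical steady state of dS/dt = production - k_deg * S."""
    return production / k_deg

def transient(production, k_deg, s0=0.0, dt=0.01, t_end=200.0):
    """Explicit-Euler integration; converges to the analytical steady state."""
    s = s0
    for _ in range(int(t_end / dt)):
        s += dt * (production - k_deg * s)
    return s
```

For example, transient(2.0, 0.5) approaches smp_steady_state(2.0, 0.5) = 4.0; validating such closed-form limits against a full simulator is the same consistency check the paper performs against GPS-X.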

Keywords: Activated Sludge Model No. 1 (ASM1), mathematical modeling membrane bioreactor, soluble microbial products, UAP, BAP, modeling SMP, MBR, heterotrophic biomass

Procedia PDF Downloads 299
19807 Maternal-Fetal Bonding for African American Mothers

Authors: Tracey Estriplet-Adams

Abstract:

This paper focuses on influences on maternal-fetal bonding by examining attachment theory, psycho-social-cultural influences/adaptations, and maternal well-being. A systematic review methodology was used to synthesize research results and summarize current evidence that can contribute to evidence-based practices. It explores the relationship between attachment styles, prenatal attachment, and perceptions of maternal-infant bonding/attachment six weeks postpartum. It also examines the protective factors of maternal-fetal attachment development. The research explores Bowlby's attachment theory and its relevance to maternal-fetal bonding through a Black feminist theory lens. Additionally, it discusses the impact of perceived stress, social support, and ecological models on maternal-fetal attachment. The relationship between maternal well-being, maternal-fetal attachment, and early postpartum bonding is reviewed. Moreover, the paper specifically addresses Black mothers and maternal-fetal bonding, exploring the intersectionality of race, ethnicity, class, geographic location, cultural identities, and immigration status. It considers the role of familial and partner support, as well as the relationship between maternal attachment style and maternal-fetal bonding, within the framework of attachment theory and Black feminist theory. Therefore, it is imperative to center Black women's voices in research, policy, and healthcare practices. Black women are experts in their own experiences and advocates for their autonomy in decision-making regarding maternal-fetal health. By amplifying their voices, we can ensure that interventions are grounded in their lived experiences.

Keywords: maternal-fetal bonding, infant well-being, maternal-infant attachment, black mothers

Procedia PDF Downloads 76